By Martijn Versteegen, CEO and co-founder at Imagin Studio
Are you already using artificial intelligence (AI) to work more efficiently? For fleet managers and leasing companies, AI offers opportunities to improve the way they work, such as smart chatbots, automated emails, or the quick generation of attractive visuals.
It sounds appealing, but have you fully considered the risks?
One issue involves intellectual property. AI models learn from massive datasets that frequently include copyrighted texts and images.
Currently, prominent AI firms face legal challenges, the most recent being Getty Images versus Stability AI reaching the High Court in London.
Getty alleges Stability AI illegally scraped millions of images from its archives to train its image generation model without appropriate licensing.
Such cases, alongside notable US lawsuits like the New York Times against OpenAI and Microsoft, highlight growing concerns about copyright infringements and potential legal exposure for businesses using AI-generated content.
The final High Court ruling in the Stability AI case could set precedent for UK businesses relying on AI-generated content.
Another challenge is AI's propensity to "hallucinate": to fabricate information, producing misleading or incorrect outputs.
A recent, somewhat amusing example was a summer reading preview published in the Chicago Sun-Times that included several non-existent book titles; the writer was unaware that the AI tool he used had made them up.
But these kinds of errors can have serious consequences and cause reputational damage.
Fleet managers relying on AI-generated data and imagery must anticipate additional checks and corrections. This requirement could offset some of the time and cost benefits AI initially promises.
There’s also the issue of bias and discrimination. AI learns patterns from existing human-generated data, often perpetuating embedded biases and stereotypes unintentionally.
Such bias in generated content can inadvertently marginalise or exclude specific groups, potentially leading to public relations challenges or regulatory scrutiny.
For fleet managers, using AI tools comes with a caveat. It means checking outputs carefully, questioning the data behind automated decisions, and being aware of where content comes from. The technology may be fast, but it’s not foolproof.
One way to reduce the risk is to work with specialist providers who focus on copyright-compliant content, taking part of the legal and reputational burden off your plate.
AI can be a useful tool, but it still needs human oversight to avoid mistakes, bias, and the wrong kind of exposure.