For Christmas I received a fascinating gift from a friend - my very own "bestselling" book.
"Tech-Splaining for Dummies" (great title) bears my name and my image on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's an interesting read, and hilarious in parts. But it also meanders rather a lot, sitting somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in collating data about me.
Several sentences begin "as a leading technology reporter ..." - cringe - which might have been scraped from an online bio.
There's also a strange, repetitive hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are dozens of firms online offering AI book-writing services. My book was from BookByAnyone.
When I contacted its president, Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open-source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating one in anybody's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and delight".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold further.
He hopes to broaden his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound quite like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based upon it.
"We should be clear, when we're talking about data here, we actually mean human creators' life works," says Ed Newton Rex, founder of Fairly Trained, which campaigns for AI firms to respect creators' rights.
"This is books, this is articles, this is photos. It's works of art. It's records... The whole point of AI training is to learn how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still wildly popular.
"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without permission should be banned," Mr Newton Rex adds. "AI can be very powerful but let's build it ethically and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "insanity".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and destroying the incomes of the nation's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against weakening copyright law for AI.
"Creative industries are wealth developers, 2.4 million tasks and a great deal of joy," says the Baroness, who is also a consultant to the Institute for Ethics in AI at Oxford University.
"The federal government is weakening one of its best performing industries on the unclear pledge of growth."
A federal government representative stated: "No move will be made till we are absolutely confident we have a practical plan that provides each of our goals: increased control for right holders to help them license their material, access to top quality material to train leading AI models in the UK, and more transparency for right holders from AI developers."
Under the UK government's new AI strategy, a national data library containing public information from a large range of sources will also be offered to AI scientists.
In the US the future of federal rules to control AI is now up in the air following President Trump's return to the White House.
In 2023 Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and in particular against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI companies argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If this wasn't all enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that for now, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of inaccuracies and hallucinations, and it can be quite hard to read in parts because it's so verbose.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.