Can Indian Laws Safeguard our Artists in the Age of Artificial Intelligence?

Tsuktiben Jamir

We are witnessing, in real time, the emergence of artificial intelligence (AI) as a part of everyday human existence. No matter how much we resist it, its potential and reach cannot be denied. It has seeped into entertainment, marketing, education, and, most of all, art-making.

Interestingly, AI is accused of plagiarising or stealing the work of other artists. Yet human artists have always done much the same thing. One of the first lessons an artist learns in art school is about other painters and their styles, and artists go on to employ the methods and brush strokes of many predecessors in their own work. Add to that the external inspirations, and the influence of the artists who came before them, which may have motivated them to start creating art in the first place. In the end, people perform much the same action as the AI: everything and everyone is shaped by influence, whether human or machine.

So when and how do we draw the line on a firm's ability to keep exploiting the work of authors, painters, and singers to train its AI models without the creators' consent, especially when the output seems so uncannily similar and the AI can duplicate almost anything (given the right sort of data)? The question thus arises: can you sue an AI for attempting to copy your work in India? The answer is no.

According to experts consulted by The Quint, if and when India decides to include AI-generated content in its copyright laws, it should remember that the Copyright Act only protects the actual work of expression, not the concepts, ideas, or theories that inspired it. “When it comes to writing, can a writer claim copyright over AI copying their style? Essentially, isn’t all writing inspired by someone else’s style? This is why laws protect the tangible work, not the process that leads to it,” Delhi-based Intellectual Property Rights lawyer Karmanya Dev Sharma told The Quint.

However, several cases have sprung up around this issue. For instance, three of the most well-known art generators, Stable Diffusion, Midjourney, and DreamUp, were accused of copyright infringement in January by US artists Sarah Anderson, Kelly McKernan, and Karla Ortiz for using millions of copyrighted images to train their models without the artists’ knowledge or consent.

Meanwhile, Getty Images, the stock photo provider, filed a lawsuit against Stability AI (the company that developed Stable Diffusion, an AI system that produces images from text inputs), accusing it of using a number of its images without permission to train the software to produce more accurate depictions based on user prompts.

“While copyright can only be claimed on tangible work and not ‘style’, in these cases, the creators are trying to convey that their work is being ‘copied’ without consent to generate more similar images. That’s another way to sue these companies for violating their Intellectual Property rights,” IP rights expert Anand Nishanth pointed out.

Nishanth pointed to a possible way forward, telling The Quint, “One of the ways to go about it is to issue a commercial license for the material it (the AI) uses and compensate the creators.”

As far as India is concerned, AI output is classified as “machine-generated work” under the Indian Copyright Act, and authorship of such works is given to the “person who causes the work to be created”. The criteria for authorship, and whether that title can belong only to a human, are still up for debate. Furthermore, High Courts are giving copyright registrations the status of official “licences” that might offer protection from copyright infringement.

Conclusion: The speed and diversity with which “machine-generated works” can be produced, combined with their dubious authorship and “bulletproof” copyright registrations, make for a dangerous mix. That, for now, is where we stand.
