The Ghost Singer

A voice reaches the podium

A singer who does not exist has just climbed onto the podium.

On April 17, 2026, “Celebrate Me,” attributed to IngaRose, reached number one on the U.S. and global iTunes charts, according to Dexerto. Apple Music lists the single as a March 31, 2026 release under Myers Music. Dexerto has linked the project to Dallas Little, a producer based in South Carolina, and the songs are presented as human-written lyrics refined with Suno, an AI music platform. (Dexerto)

A song.

A voice.

An emotion.

And behind it? Not really an artist in the usual sense.

That is the disturbing detail. For a long time, we associated music with an identifiable human presence: a person, a story, a face, a wound, a joy, a trajectory. Even when the industry manufactured artists, there was still a body, a voice, a possible stage presence.

Here, the scenery changes.

The work arrives before the artist.

Emotion arrives before identity.

The public listens before asking who is singing.

The iTunes podium is not Billboard

We need precision.

iTunes ranks purchases, not streaming plays. Apple clearly distinguishes iTunes Store purchases from Apple Music plays: a purchased track is counted as a “Purchase,” while an Apple Music listen is counted as a “Play.” (Apple Community)

In other words, an iTunes number one does not automatically mean total cultural domination. Its podium is more accessible than a massive Spotify ranking or a Billboard chart that integrates multiple signals.

So no, the music industry did not collapse live on April 17, 2026.

But it did receive a signal.

A cold signal.

A disturbing signal.

A signal many will prefer to minimize because it touches a sensitive area: the very definition of the artist.

When emotion becomes independent from the author

The key point is not only that AI can help produce a song.

The key point is that the listener can be moved before knowing who, or what, participated in the production.

That is a cultural reversal.

For decades, the music industry sold more than a track. It sold embodiment. A voice was not just an audio signal: it was the trace of a body, a story, a wound, an intention.

With generative AI, this equation becomes unstable.

A voice can be synthetic.

A biography can be fabricated.

An image can be generated.

A social presence can be orchestrated.

An emotion can be real for the listener while being produced by a hybrid creation chain.

That is where the debate becomes difficult. Because the emotion felt by the audience is not synthetic. It is human. The unease comes from this gap: an artificial production can trigger an authentic reaction.

Bandcamp closes the door, Spotify chooses transparency

Platforms are not responding in the same way.

Bandcamp has adopted a strict position. Its policy states that music and audio created wholly or substantially by artificial intelligence are not permitted. Bandcamp also says it reserves the right to remove music suspected of being AI-generated. (Bandcamp)

Spotify is taking a different path. The platform announced a policy focused on three areas: stronger enforcement against vocal impersonation, a music spam filter, and AI disclosures in music credits. Spotify also states that transparency is not meant to automatically penalize artists who use AI responsibly. (Spotify)

Two visions are colliding.

On one side: protect a human space by strongly filtering generated content.

On the other: accept the use of AI, but make its role more legible.

The first approach favors boundaries.

The second favors traceability.

Neither solves everything. A ban can be difficult to enforce. Transparency can be insufficient if it arrives too late in the listening experience.

Should “created with AI” appear before listening?

The question deserves more than a reflex.

A clear label before listening has one advantage: it gives the audience decisive information at the moment of choice. It prevents the origin of the work from being discovered only afterward, as a form of emotional deception.

But it also carries a risk: creating an automatic bias. Many people will reject a track before listening to it, not because it is bad, but because it is associated with AI.

Yet the absence of information creates an equally serious problem.

In music, we do not consume only a sound file. We consume an imagined relationship with an artist. We want to believe that this voice comes from somewhere. We want to connect emotion to intention.

Whether that intention is human, AI-assisted, AI-generated, or assembled by a production team using synthetic tools makes a real difference.

The public can accept many things.

It still needs to know what it is listening to.

The debate is not only about art

The IngaRose case goes beyond music.

It announces what will happen across creative industries: images, video, advertising, training, journalism, influence, audiobooks, customer service, design, consulting, entertainment.

The question will no longer be only: “Can AI produce?”

The question will become: “What rules do we want to apply when AI produces something close enough to human creation to trigger attention, trust or emotion?”

This is exactly the type of shift I discuss in my book in chapter 14, dedicated to the application of artificial intelligence. The table of contents places this chapter within the part dedicated to the structure of the innovational intelligence system, after the pillars of vision, culture, communication, organization, methods and talent.

AI is not just a tool.

It forces organizations to redefine their rules.

It forces platforms to choose their doctrine.

It forces brands to clarify their relationship with truth.

It forces leaders to decide before the market decides for them.

The strategic karaoke of organizations

The most dangerous move for a company is to treat this subject as a cultural curiosity.

A synthetic singer topping an iTunes chart is amusing for five minutes.

Then governance questions begin.

Who is allowed to use AI in the organization?

To produce what?

With what level of transparency?

With what data?

With what human validation?

With what responsibility in case of error, plagiarism, reputational damage or customer confusion?

Without answers, teams improvise.

Marketing tests.

Communications publishes.

Sales teams personalize.

HR automates.

Suppliers deliver.

And one day, the organization discovers that it has already shifted into AI, but without a shared framework.

Waiting for algorithms to write your strategy would be a risky form of managerial karaoke.

You think you are singing your own song.

In reality, someone else chose the lyrics.

The author trembles, but does not disappear

It would be easy to conclude that the author is dead.

I do not believe that.

The author changes position.

Sometimes the author becomes a designer, a selector, a director, an editor, an artistic director, an AI trainer, an experience architect.

The problem is not that AI enters creation.

The problem is that we do not yet have the words, rules and reflexes to clearly distinguish degrees of human intervention.

A song written by a human, composed with AI assistance, sung by a synthetic voice and distributed under the identity of a virtual character is not the same thing as a song entirely generated from a prompt.

Putting everything in the same box would be lazy.

Leaving everything vague would be irresponsible.

What leaders should remember

The “Celebrate Me” case does not announce the end of human music.

It announces the end of naivety.

Organizations must move past three illusions.

First illusion: believing AI will remain limited to technical tasks. It is already entering emotion, image, voice, narrative and relationships.

Second illusion: believing the public will automatically reject synthetic creations. The public can listen, share, buy, comment, dance, cry, then only afterward discover the origin of the track.

Third illusion: believing regulation or platforms will solve everything. They will create frameworks, but every organization will need to build its own doctrine.

In the coming years, the issue will not only be whether a work is “good.”

We will also need to know whether it is fair.

Fair to the audience.

Fair to creators.

Fair to the people whose voices, styles or data may have fed the system.

Fair to the brand distributing it.

A ghost singer has handed us a mirror

IngaRose, based on the information available today, may not be an artist in the classic sense.

But the phenomenon she represents is real.

A synthetic voice can enter the charts.

A virtual identity can circulate like an artistic identity.

A platform can arbitrate visibility.

An audience can listen without asking many questions.

That is the point that should wake us up.

In music, as in business, AI does not only replace tasks. It shifts the boundaries of trust.

And trust, once blurred, costs far more to rebuild than to protect.

The next time you hear a voice that moves you, you may ask yourself: who is really singing?

The answer will matter.

But the moment when you learn it will matter even more.



Philippe Boulanger, international speaker on innovation and artificial intelligence, author, advisor, mentor and consultant.

