“Never Supposed to Be”: How I Made My First AI-Generated Music Video

When I started working on “Never Supposed to Be,” I didn’t set out to make a music video.

I was chasing a feeling — the soft melancholy of a failed relationship, wrapped in a hazy, surreal daydream, and dressed in the faded tones of 1960s British pop. What I ended up with was not only a song, but my very first music video — and I made it all with the help of artificial intelligence.

The Song That Sparked It All

The song itself is a kind of post-mortem. It’s told from the perspective of someone looking back at a relationship that never really stood a chance. There’s resignation in the lyrics, but also a bitter honesty — like reading a love letter you never sent. I wrote the lyrics and composed the track using Suno.ai, which gave me the space to experiment freely with melody, harmony, and tone. I went through several versions before I found the sound I was after — something emotionally raw but melodically nostalgic.

From Sound to Vision

Interestingly, it was the vibe of the song that inspired the visuals. I kept imagining scenes that felt like a dream you can’t fully wake up from: soft colors, strange juxtapositions, slightly off-kilter — like a 1960s music video that melted into a painting. Using Midjourney, I started prompting those images into existence. Some were still images, others became animated sequences. The surrealism came naturally through the AI’s visual imagination, and I embraced the unexpected.

Stitching It All Together

Once I had the material, I edited everything in KineMaster — a mobile-friendly video editing app. That’s where I learned to pan and zoom subtly, to bring motion to still images, and to pace the cuts with the song’s dynamics. I also had to figure out which clips fit together — not just visually, but emotionally.

At this point, it felt like the video was building itself, almost like the song had left a trail of images behind it, and I just had to follow.

Challenges? Of Course.

This was all new terrain for me. I had to teach myself how Midjourney’s video loops work, how to generate cohesive scenes, and how to match the movement of visuals to music. I also learned that less is often more: a gentle zoom, a slow fade, the right kind of color tone — these small things made a big difference.

Why AI?

Because it gave me tools I didn’t have before. I’m not a trained animator. I don’t have a video crew. But I do have stories to tell and feelings to express — and now, I have access to creative tools that can help me do it, even without a big budget or technical background.

Final Thoughts

I didn’t expect to release a music video. But this song almost demanded one. It came from a specific emotional place, and with AI tools and a lot of trial and error, I found a way to bring it to life visually.

“Never Supposed to Be” is now out — and I’m proud to say it’s mine, every pixel and note of it, even if some of it was whispered to me by machines. You can watch the video here, and I’d love to hear what it evokes in you.