'Happy dance' — Just type text prompts to create your own music with Adobe's new Gen AI tool
By inputting text prompts such as 'powerful rock,' 'happy dance,' or 'sad jazz,' users can seamlessly generate music tailored to their preferences with Adobe's experimental Project Music GenAI Control.
On Wednesday, February 28, Adobe previewed an experimental project set to redefine how individuals craft custom audio and music. Dubbed 'Project Music GenAI Control,' the early-stage tool uses generative AI to let creators not only generate music from text prompts but also exercise precise control over the editing process, tailoring the audio to their exact specifications.
Nicholas Bryan, Senior Research Scientist at Adobe Research and one of the creators behind the technology, elaborated on the transformative potential of Project Music GenAI Control in a blog post. "With Project Music GenAI Control, generative AI becomes your co-creator. It empowers individuals across various domains, from broadcasters to podcasters, to craft music that perfectly encapsulates the desired mood, tone, and duration," Bryan said.
According to Adobe, Project Music GenAI Control follows in the footsteps of Firefly, Adobe's family of generative AI models for image generation, which is said to have generated over six billion images to date. The company also reaffirmed that all content produced using Firefly includes Content Credentials — essentially 'nutrition labels' for digital content — ensuring accountability, responsibility, and transparency in content usage.
The cornerstone of the new tool is its text prompt-based approach, a methodology also used in Adobe's Firefly. By inputting text prompts such as 'powerful rock,' 'happy dance,' or 'sad jazz,' users can seamlessly generate music tailored to their preferences. Integrated fine-grained editing capabilities then allow users to manipulate various aspects of the generated audio directly within the same workflow.
From adjusting tempo and structure to looping sections and remixing compositions, the tool provides unparalleled flexibility to meet diverse creative needs, Adobe said.
Bryan also highlighted the significance of these tools, likening them to a "pixel-level control for music," enabling creatives to shape, refine, and mould audio with granularity.
Project Music GenAI Control is currently being developed collaboratively with experts from the University of California, San Diego, and Carnegie Mellon University.
First Published: Feb 29, 2024 7:50 PM IST