Software being developed at UL composes jingles and music for ads, raising interesting copyright issues, writes Anna Nolan
A few clicks on the computer are now all you need to begin composing original music for video clips, television advertisements, promotions and documentaries. The computer programme enabling this was developed by researchers in the Interaction Design Centre (IDC) at the University of Limerick.
The video-driven composition system, originally called MetaMusic but now known as Abaltat Muse, works by detecting events on the video and automatically generating suitable music to accompany them.
The package was developed in consultation with several video editors, explained UL lecturer, researcher and course director Mikael Fernstrom. "We needed to discover the vocabulary used by the editors, and explore their mental models," he said.
The IDC is part of UL's department of computer science and information systems. It is an interdisciplinary research group that works on the design, use and evaluation of information and communications technologies.
The UL Abaltat Muse development team is made up of postgraduate students who between them have an array of programming and musicology skills - artificial intelligence (AI), artificial neural networks, music technology, video analysis, graphical user interface (GUI) expertise, software engineering and more, says Fernstrom.
They built a cognitive model of the minds of music composers in various genres, determined the rules by which they composed, and taught these to the computer package using artificial neural networks.
They also developed another cognitive model for analysing the video stream the way a composer would, by concentrating on colour and movement, he says.
So far, they have produced three musical genres - atmospheric, rhythm 'n' blues and baroque - and more will be added.
"We took 150 baroque compositions and analysed them for patterns and motifs," explains IDC team member, Ian O'Keeffe. "The atmospheric genre is based on trance music, which was much easier to analyse than Bach."
Analysing the video in a suitable manner was a major challenge.
"At first we had no way of mapping the video to the music, but we made a three-dimensional model focused on the colour content, so we could for example, follow a blue car," says O'Keeffe's IDC colleague Tony O'Callaghan.
Intrinsic to the process is a graphical representation that Mr O'Callaghan has called a correlogram, because it charts the probability that two adjacent pixels share the same colour.
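Mr O'Callaghan does not spell out the computation, but a minimal sketch of the idea, assuming coarsely quantised RGB colours and only immediately adjacent (distance-one) neighbours, might look like this (the function and parameters are illustrative, not Abaltat Muse's code):

import numpy as np

def adjacent_colour_correlogram(frame, n_bins=8):
    # For each quantised colour, estimate the probability that a pixel of
    # that colour has a horizontally or vertically adjacent pixel of the
    # same colour. `frame` is an H x W x 3 uint8 RGB array.
    q = (frame.astype(np.uint16) * n_bins) // 256
    colours = q[..., 0] * n_bins * n_bins + q[..., 1] * n_bins + q[..., 2]

    n_colours = n_bins ** 3
    same = np.zeros(n_colours)    # same-colour adjacent pairs, per colour
    total = np.zeros(n_colours)   # all adjacent pairs, per colour

    for a, b in ((colours[:, :-1], colours[:, 1:]),    # horizontal pairs
                 (colours[:-1, :], colours[1:, :])):   # vertical pairs
        np.add.at(total, a, 1)
        np.add.at(same, a[a == b], 1)

    with np.errstate(invalid="ignore"):
        return np.where(total > 0, same / total, 0.0)

A high value for a given colour means regions of that colour are large and coherent from pixel to pixel, which is what lets the system pick out and follow something like the blue car mentioned above.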
"There are other packages available, but not like ours," states Eoin Brazil, another member of the IDC.
"It's a middle tool, slotting in between stock music and composition tools that need a great deal of experience."
For a given video clip, a human video editor chooses the music genre, tempo, number of beats per second and complexity from menus. The package analyses the varying amounts of different colours in the video, frame by frame, and produces the musical soundtrack to go with it.
This music can then be used as it is, or the editor can vary it by introducing key changes or changing the instruments. There is no need to involve a human composer, so the music is royalty-free. At all times, the soundtrack remains under the control of the video editor.
Alternatively the music and video can be given to a human composer as an indication of the type of music needed and the timing required.
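How those frame-by-frame colour measurements become notes is not described in the article; purely as a toy illustration of such a mapping (the rules below are invented, not Abaltat Muse's), per-frame colour statistics could drive pitch, loudness and timing along these lines:

import numpy as np

def frames_to_notes(frames, tempo_bpm=120, fps=25, base_pitch=60):
    # Toy mapping from per-frame colour statistics to (onset, pitch, velocity)
    # note events. `frames` is a sequence of H x W x 3 uint8 RGB arrays.
    seconds_per_beat = 60.0 / tempo_bpm
    notes = []
    for i, frame in enumerate(frames):
        mean_rgb = frame.reshape(-1, 3).mean(axis=0) / 255.0
        brightness = float(mean_rgb.mean())
        pitch = base_pitch + int(brightness * 24)     # brighter -> higher
        velocity = int(40 + mean_rgb[0] * 80)         # redder -> louder
        # Quantise each onset to the nearest beat so the result stays in tempo.
        onset = round((i / fps) / seconds_per_beat) * seconds_per_beat
        notes.append((onset, pitch, velocity))
    return notes

# e.g. frames_to_notes([np.zeros((72, 96, 3), dtype=np.uint8)] * 50)

In practice the editor's menu choices of genre, tempo and complexity would constrain which notes are allowed at all, which is where the learned composition rules described earlier take over.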
The IDC's software prototype has been transferred to a new Irish start-up company, Meiticheol Teoranta, which trades as Abaltat from Spiddal, Co Galway. The university owns some of the intellectual property exclusively, and some of it is shared with the company.
The package is expected to be on the market shortly. The development was sponsored by Enterprise Ireland's Innovation Partnership.