Upcoming events

CSIS Design, Art and Technology Seminar Series 2018 – we are pleased to announce a talk by GASH COLLECTIVE.

Everyone is welcome. 

Details:
Speaker: Laura O'Connell / Ellen King
Date: Wednesday, February 14th, 3–4.30pm
Venue: CSG001

GASH COLLECTIVE acts as a platform to encourage women to create, share and collaborate in the field of electronic music in Ireland. Through safe space initiatives, carefully curated parties and events, as well as production and DJ workshops, GASH intends to shine a light on female-identifying producers and DJs in Ireland.

GASH COLLECTIVE was founded on International Women's Day 2016, inspired by Female:Pressure's VISIBILITY project and like-minded collectives such as SIREN (London) and DISCWOMAN (NYC).

Light Moves – Festival of Screendance – 2018

Light Moves presents feature screenings, short films, invited works and open submissions, as well as a Screendance Lab and Symposium with some of the most respected figures in the field. The Light Moves Screendance Symposium encourages artistic and scholarly exchange, debate and discussion in screendance and related disciplines including performance, dance, film, visual arts, sound/music and text.

Light Moves is curated by DMARC faculty Jürgen Simpson and Mary Wycherley and produced by Dance Limerick, in partnership with DMARC (Digital Media and Arts Research Centre) at the University of Limerick. The festival is supported by the Arts Council, Limerick City and County Council and the JP McManus Fund.

More info here – http://www.lightmoves.ie/

Wednesday, February 29th, 2012

Nick Collins

3.00pm Computer Science Auditorium (CSG-01)

Machine listening and learning for musical systems

Musical artificial intelligences are playing an important role in new composition and performance systems. Critical to enhanced capabilities for such machine musicians will be listening facilities modeling human audition, and machine learning able to match the minimum 10,000 hours, or ten years, of intensive practice of expert human musicians. Future musical agents will cope across multiple rehearsals and concert tours, or gather multiple commissions, potentially working over long musical lifetimes; they may be virtuoso performers and composers attributed in their own right, or powerful musical companions and assistants to human musicians.

In this seminar we’ll meet a number of projects related to these themes. The concert system LL will be introduced, an experiment in listening and learning applied in works for drummer and computer, and electric violin and computer. Autocousmatic will be presented, an algorithmic composer for electroacoustic music which incorporates machine listening in its critic module. Large corpus content analysis work in music information retrieval shows great promise when adapted to concert systems and automated composers, and the SuperCollider library SCMIR will be demonstrated, alongside a new realtime polyphonic pitch tracker.
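
For readers less familiar with the area, below is a minimal machine-listening sketch in SuperCollider. It is illustrative only and not taken from LL, Autocousmatic or SCMIR (which offer far richer analysis); it uses only the standard Pitch and Onsets UGens to track the pitch and note onsets of a live input and shadow it with a sine tone.

// Minimal machine-listening sketch (illustrative only, not from the talk):
// track the pitch and onsets of a live input and shadow it with a sine tone.
(
s.waitForBoot {
    SynthDef(\listener, {
        var in, freq, hasFreq, chain, onset, tone;
        in = SoundIn.ar(0);                                // mono live input
        # freq, hasFreq = Pitch.kr(in);                    // monophonic pitch tracker
        chain = FFT(LocalBuf(512), in);
        onset = Onsets.kr(chain, 0.5);                     // spectral onset detector
        SendReply.kr(onset, '/onset', freq);               // report each onset to the language
        tone = SinOsc.ar(freq.lag(0.1), 0, 0.1 * hasFreq); // sound only while pitch is detected
        Out.ar(0, tone ! 2);
    }).add;
    s.sync;
    Synth(\listener);
    OSCdef(\onsetWatcher, { |msg|
        ("onset near " ++ msg[3].round(0.1) ++ " Hz").postln;
    }, '/onset');
};
)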

Biography:

Nick Collins is a composer, performer and researcher in the field of computer music. He lectures at the University of Sussex, running the music informatics degree programmes and research group. Research interests include machine listening, interactive and generative music, and audiovisual performance. He co-edited the Cambridge Companion to Electronic Music (Cambridge University Press, 2007) and The SuperCollider Book (MIT Press, 2011) and wrote Introduction to Computer Music (Wiley, 2009). iPhone apps include RISCy, TOPLAPapp, Concat, BBCut and PhotoNoise for iPad. Sometimes, he writes in the third person about himself, but is trying to give it up. Further details, including publications, music, code and more, are available from http://www.cogs.susx.ac.uk/users/nc81/index.html