As has become a tradition here at CatSynth, we present our end-of-year image.
[Click to enlarge.]
It was a bit of a challenge to decide what to put in, as there were so many images to choose from this time. But I think these are particularly representative. It is also significant that this year’s image is more colorful than previous end-of-year images.
The first few days of this year were quiet and a bit dark. That changed quickly, with tumultuous events around the world and new experiences close to home. It’s the year I finally had a photography show, and by the end of the year I had several. There were surprising new types of performances and the costumes to go with them. I deepened my connections back in New York with friends, music, art and the landscape. And I had no idea that I would have the chance to participate in something like the Occupy movement. There were many sad moments as well, with the loss of friends.
In all, 2011 has been particularly rich and productive, if sometimes a bit chaotic. If one had told me at the end of 2007 or 2008 (or 2001 for that matter) that this is what life would be like now, I would have been pleasantly surprised. There is a sense, however, that the patterns of this past year are not sustainable. This will have to be part of the plan for 2012, in particular getting organized, staying healthy and trying to make good choices. We will see how that unfolds as the new year progresses…
Happy New Year and thank you for all the support and warmth from those who read these pages!
This is an instructional demo from the blog: http://SynthandI.blogspot.com
The first half explains what is needed to connect an iPad running a MIDI controller app (in this case TouchOSC) to the Buchla 200e modular synthesizer. The second half of the video shows a bit about the patch I made in TouchOSC and discusses the possibilities for further exploration.
The Outsound Music Summit began this Sunday with the annual Touch the Gear Expo. Visitors have a chance to see and try out the equipment used by musicians and sound artists. We had a diverse group of participants this year, and this short video gives a good overview of some of the sound and visuals that one would have encountered:
We had a decently sized turnout for the event, and the evening went by quickly. When I was not at my own station, I did my best to see others’ work, but did not get to everyone. For those who followed my live tweets from the event, the remainder of this article might seem redundant, but I do provide more detail.
I brought a small rig that reflects my recent solo work, with an iPad as both a synthesizer and controller for software on the laptop, a monome, the Wicks Looper and a Korg Mini-Kaoss Pad.
The iPad was primarily running TouchOSC, controlling a version of my piece Charmer:Firmament running in Open Sound World on the laptop, as well as a few popular instruments like the Smule Magic Fiddle and Bebot. The monome was controlling sample loops, and the Wicks Looper was feeding into the Kaoss Pad.
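Under the hood, TouchOSC talks to software on the laptop by sending Open Sound Control messages over UDP. As a rough illustration of what travels over the wire (the `/1/fader1` address and the helper names are my own for this sketch, not from an actual TouchOSC layout or OSW patch), here is a minimal OSC encoder/decoder for float arguments:

```python
import struct

def _pad(b: bytes) -> bytes:
    """Pad to a multiple of 4 bytes, as OSC requires (at least one NUL)."""
    return b + b"\x00" * (4 - len(b) % 4)

def encode_osc(address: str, *args: float) -> bytes:
    """Build a minimal OSC message with float32 arguments."""
    msg = _pad(address.encode()) + _pad(("," + "f" * len(args)).encode())
    for a in args:
        msg += struct.pack(">f", a)   # OSC floats are big-endian float32
    return msg

def decode_osc(packet: bytes):
    """Parse the address, type tags, and float32 args back out of a packet."""
    def read_str(data, i):
        end = data.index(b"\x00", i)
        s = data[i:end].decode()
        i = end + (4 - end % 4)       # skip past the 4-byte padding
        return s, i
    address, i = read_str(packet, 0)
    tags, i = read_str(packet, i)
    values = []
    for t in tags[1:]:                # skip the leading ','
        if t == "f":
            values.append(struct.unpack(">f", packet[i:i + 4])[0])
            i += 4
    return address, values
```

Each widget in a TouchOSC layout maps to an address pattern along these lines, with fader and rotary positions arriving as floats that the receiving patch can map onto synthesis parameters.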
Next to me, Matt Davignon presented a turntable and effects pedals that were quite popular with visitors. There is still something about a tactile and intuitive interface such as a turntable that compels people to want to play it. In contrast, the monome in particular seemed to intimidate people.
There were many non-electronic offerings as well, including the quartz cantabile by Todd Larew. Who needs electronics when you have fire as your primary technology!
Bob Marsh wandered the hall in a suit covered in plastic water bottles, some containing mechanical sound generating elements, and was quite a presence throughout the evening.
He also brought several other articles of sonic clothing for people to try on and play.
Tim Thompson brought his Space Palette, a large wall-sized controller in which one controls sound and visuals by moving in the various spaces in the panel.
I had seen him perform with the Space Palette before, but this was my first opportunity to try it out myself.
Another original instrument, the Ernestophone, featured one main string and several sympathetic strings, and a very rich sonic palette of overtones.
Phogmasheen presented an instrument made from pick heads and cake pans.
One strikes the metal elements with mallets or sticks, and then pickups process the output electronically.
This is not the first time I have seen a classic 1950s HP oscillator at Touch the Gear, but it’s the first time I have seen one paired with a Peerless transistor radio, for a very retro noise experience.
Noise rigs are a common theme, particularly chains of effects pedals and mixers that start from nothing but the noise inherent in electronic circuits, then amplify and shape it through the non-linear processes of the effects chain into rich and chaotic sound palettes. One example is this colorful rig from CJ Borosque. I was able to get subtle and expressive control of the sound by focusing on only a couple of knobs.
Other participants included Tom Nunn presenting one of his sonic inventions, Rick Walker demonstrating highly virtuosic use of live-looping hardware, and Laurie Amat getting rather humorous results from the sound of the crowd in the hall processed through a classic green Line6 delay pedal.
The panel discussion on Monday night, entitled “Elements of non-idiomatic compositional strategies” was quite a contrast to Touch the Gear Night. Four composers, Kanoko Nishi, Andrew Raffo Dewar, Krystyna Bobrowski, and Gino Robair engaged in a discussion moderated by Polly Moller about their music, influences and views on composition in front of an intimate audience with plentiful wine, cheese and dark chocolate.
One of the interesting questions was whether each of the composers began with sound as the starting point for a piece. Not surprisingly, the answer was no – although sound was the medium of creativity, the source ideas can come from anywhere. In speaking about his piece for the Friday concert at the summit, Andrew Raffo Dewar described how the work was influenced very directly by paintings by the Argentine artist Eduardo Serón. Gino Robair similarly described a very visual and conceptual influence for his suite based on the engravings of Jose Guadalupe Posada depicting late 19th- and early 20th-century life in Mexico, and the skeletons and skulls in particular. Kanoko Nishi referred to “music completely devoid of symbols”; and Krystyna Bobrowski described her work with her created instruments as a “sonic bloom of resonance”, perhaps my favorite phrase of the evening.
Other topics discussed included composing for instruments or sounds versus composing for particular musicians, i.e., “instead of preparing the piano, prepare the pianist” (as a pianist, I am not sure how I feel about being prepared), and questions about the rewards of composing experimental music – because it was accepted by panelists and audience alike that there are neither financial nor sexual riches to be gained by this pursuit. Perhaps the response that rang most true to me was that composing music is an obsessive-compulsive activity that some of us just have to do whether we like it or not.
For those not familiar with the terms, think of idiomatic music as music that falls into recognizable patterns and genres that one can readily identify; non-idiomatic music, then, is music that attempts to defy such categorization. However, I often find the dichotomy not particularly useful. I sympathize with the composers’ desire to do work that transcends categorization, and I often strive to do the same thing – but we can’t help but be influenced by the music and sounds around us, and shouldn’t necessarily fear the appearance of these influences in music that we call “new”. It was also interesting how much all four panelists distanced themselves from mathematics, even while acknowledging its deep and longstanding interconnection with music.
Today we look back on my solo concert at the Center for New Music and Audio Technologies (CNMAT) at U.C. Berkeley back in early March. It was part of my U.C. Regents’ Lecturer appointment this year, which also included technical talks and guest lectures for classes.
This is one of the more elaborate concerts I have done. Not only did I have an entire program to fill on my own, but I specifically wanted to showcase various technologies related to my past research at CNMAT and some of their current work, such as advanced multi-channel speaker systems. I spent a fair amount of time onsite earlier in the week to do some programming, and arrived early on the day of the show to get things set up. Here is the iPad with CNMAT’s dodecahedron speaker – each face of the dodecahedron is a separate speaker driven by its own audio channel.
[click image for larger view.]
Here is the Wicks Looper (which I had recently acquired) along with the dotara, an Indian string instrument often used in folk music.
[click image for larger view.]
I organized the concert such that the first half was more focused on showcasing music technologies, and the second half on more theatrical live performance. This does not imply that there wasn’t strong musicality in the first half or a lack of technological sophistication in the second, but rather which theme was central to the particular pieces.
After a very generous introduction by David Wessel, I launched into one of my standard improvisational pieces. Each one is different, but I do incorporate a set of elements that get reused. This one began with the Count Basie “Big Band Remote” recording and made use of various looping and resampling techniques with the Indian and Chinese instruments (controlled by monome), the Dave Smith Instruments Evolver, and various iPad apps.
The concert included the premiere of a new piece composed specifically for CNMAT’s impressive loudspeaker resources, the dodecahedron as well as the 8-channel surround system. In the main surround speakers, I created complex “clouds” of partials in an additive synthesizer that could be panned between different speakers for a rich immersive sound. I had short percussive sounds emitted from various speakers on the dodecahedron. I thought the effect was quite strong, with the point sounds very localized and spatially separated from the more ambient sounds. In the video, it is hard to get the full effect, but here it is nonetheless:
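The actual patch ran in Open Sound World, but the basic “cloud of partials” idea can be sketched offline in a few lines of Python. Everything here is my own assumption for illustration – the channel count, the equal-power panning law, and all of the parameter ranges – not the actual patch:

```python
import math
import random

SR = 44100          # sample rate
N_CHANNELS = 8      # assumed ring of 8 discrete surround channels

def partial_cloud(center_hz=440.0, spread_hz=40.0, n_partials=16,
                  duration=0.5, seed=1):
    """Render a 'cloud' of sinusoidal partials scattered around a center
    frequency, each panned to a random position on the speaker ring using
    equal-power panning between the two nearest speakers."""
    rng = random.Random(seed)
    n = int(SR * duration)
    out = [[0.0] * n for _ in range(N_CHANNELS)]
    for _ in range(n_partials):
        freq = rng.gauss(center_hz, spread_hz)     # scatter around center
        amp = rng.uniform(0.2, 1.0) / n_partials
        pos = rng.uniform(0, N_CHANNELS)           # position on the ring
        lo = int(pos) % N_CHANNELS                 # nearer speaker
        hi = (lo + 1) % N_CHANNELS                 # next speaker around
        frac = pos - int(pos)
        g_lo = math.cos(frac * math.pi / 2)        # equal-power gains
        g_hi = math.sin(frac * math.pi / 2)
        for i in range(n):
            s = amp * math.sin(2 * math.pi * freq * i / SR)
            out[lo][i] += g_lo * s
            out[hi][i] += g_hi * s
    return out
```

In a live setting, each cloud’s center frequency, spread and pan position would be the kind of parameters exposed to the iPad controller, and the panning would move over time rather than stay fixed per partial.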
The piece was implemented in Open Sound World – the new version that primarily uses Python scripts (or any OSC-enabled scripting language) instead of the old graphical user interface. I used TouchOSC on the iPad for real-time control.
I then moved from rather complex experimental technology to a simple and very self-contained instrument, the Wicks Looper, in this improvised piece. It had a very different sound from the software-based pieces in this part of the concert, and I liked the contrast.
The first half of the concert also featured two pieces from my CD Aquatic: Neptune Prelude to Xi and Charmer:Firmament. The original live versions of these pieces used a Wacom graphics tablet controlling OSW patches. I reimplemented them to use TouchOSC on the iPad.
The second half of the concert opened with a duo of myself and Polly Moller on concert and bass flutes. We used one of my graphical score sets – here we went in order from one symbol to the next and interpreted each.
The cat one was particularly fun, as Polly emulated the sound of a cat purring. It was a great piece, but unfortunately I do not have a video of this one to share. So we will have to perform it again sometime.
I performed the piece 月伸1 featuring the video of Luna. Each of the previous performances, at the Quickening Moon concert and Omega Sound Fix last year, used different electronic instruments. This time I performed the musical accompaniment exclusively on acoustic grand piano. In some ways, I think it is the strongest of the three performances, with more emotion and musicality. The humor came through as well, though a bit more subtle than in the original Quickening Moon performance.
The one unfortunate part of the evening came in the final piece. I had originally done Spin Cycle / Control Freak at a series of exchange concerts between CNMAT and CCRMA at Stanford in 2000. I redid the programming for this performance to use the latest version of OSW and TouchOSC on the iPad as the control surface. However, at this point in the evening I could not get the iPad and the MacBook to lock onto a single network together. The iPad could not find the MacBook’s private wireless network, even after multiple reboots of both devices. In my mind, this is actually the biggest problem with using an iPad as a control surface – it requires wireless networking, which seems to be very shaky at times on Apple hardware. It would be nice if they allowed one to use a wired connection via the USB cable. I suppose I should be grateful that this problem did not occur until the final piece, but it was still a bit of an embarrassment and gives me pause about using iPad/TouchOSC until I know how to make it more reliable.
On balance, it was a great evening of music even with the misfire at the end. I was quite happy with the audience turnout and the warm reception and feedback afterwards. It was a chance to look back on solo work from the past ten years, and look forward to new musical and technological adventures in the future.
Several pieces are going to feature the iPad (yes, the old pre-March 2 version) running TouchOSC controlling Open Sound World on the MacBook. I worked on several new control configurations after trying out some of the sound elements I will be working with. Of course, I have the monome as well, mostly to control sample-looping sections of various pieces.
One of the main reasons for spending time on site is to work directly with the sound system, which features an 8-channel surround speaker configuration. Below are five of the eight speakers.
One of the new pieces is designed specifically for this space – and also utilizes a 12-channel dodecahedron speaker developed at CNMAT. I will also be adapting older pieces and performance elements for the space, including a multichannel version of Charmer:Firmament. In addition to the multichannel work, I made changes to the iPad control based on the experience from last Saturday’s performance at Rooz Cafe in Oakland. It is now far more expressive and closer to the original.
I also broke out the newly acquired Wicks Looper on the sound system. It sounded great!
The performance information (yet again) is below.
Friday, March 4, 8PM
Center For New Music and Audio Technologies (CNMAT)
1750 Arch St., Berkeley, CA
CNMAT and the UC Berkeley Regents’ Lecturer program present an evening of music by Amar Chaudhary.
The concert will feature a variety of new and existing pieces based on Amar’s deep experience and dual identity in technology and the arts. He draws upon such diverse sources as jazz standards, Indian music, film scores and his past research work, notably the Open Sound World environment for real-time music applications. The program includes performances with instruments on laptop, iPhone and iPad, acoustic grand piano, do-it-yourself analog electronics, and Indian and Chinese folk instruments. He will also premiere a new piece that utilizes CNMAT’s unique sound spatialization resources.
The concert will include a guest appearance by my friend and frequent collaborator Polly Moller. We will be doing a duo with Polly on flutes and myself on Smule Ocarina and other wind-inspired software instruments – I call it “Real Flutes Versus Fake Flutes.”
The Regents’ Lecturer series features several research and technical talks in addition to this concert. Visit http://www.cnmat.berkeley.edu for more information.
Here Luna poses with TouchOSC on the iPad, which is becoming one of the main control surfaces I will be using to control Open Sound World. Last night I was building the synthesis infrastructure for the new piece, a combination of drum sampling and spatialized additive synthesis – at least four separate additive synthesis models that are algorithmically generated based on input from the iPad. Against this will be electronic drum sounds and an Afro-Cuban rhythm detail. I really won’t know the exact shape of this piece until I work with CNMAT’s speaker array.
I also learned from the Saturday’s performance in Oakland that I will need to refine the control on TouchOSC for the new implementation of my piece Charmer:Firmament. It was very well received, with descriptions like “beautiful” and “meditative”, but it was difficult to control compared to the Wacom graphics tablet. I will try a different mix of controls on the iPad to see if it works better.
I have been busily preparing this weekend for the first of my UC Berkeley Regents’ Lecturer presentations:
Open Sound World (OSW) is a scalable, extensible programming environment that allows musicians, sound designers and researchers to process sound in response to expressive real-time control. This talk will provide an overview of OSW, past development and future directions, and then focus on the parallel processing architecture. Early in the development of OSW in late 1999 and early 2000, we made a conscious decision to support parallel processing as affordable multiprocessor systems were coming on the market. We implemented a simple scalable dynamic system in which workers take on tasks called “activation expressions” on a first-come, first-served basis, with facilities for ordering and prioritization to deal with real-time constraints and synchronicity of audio streams. In this presentation, we will review a simple musical example and demonstrate performance benefits and limitations of scaling to small multi-core systems. The talk will conclude with a discussion of how current research directions in parallel computing can be applied to this system to solve past challenges and scale to much larger systems.
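As a toy model of that worker scheme – the real OSW engine is not written this way, and the class and method names here are illustrative only – a pool of workers pulling prioritized tasks from a shared queue on a first-come, first-served basis might look like this:

```python
import queue
import threading

class Scheduler:
    """Toy sketch of the worker/activation-expression idea: workers pull
    tasks from a shared queue; a priority field lets time-critical (audio)
    work jump ahead, and a sequence number preserves FCFS order on ties."""
    def __init__(self, n_workers=4):
        self.tasks = queue.PriorityQueue()
        self.results = []
        self._lock = threading.Lock()
        for _ in range(n_workers):
            threading.Thread(target=self._run, daemon=True).start()

    def submit(self, priority, seq, fn, *args):
        # Lower priority number = more urgent; seq must be unique so the
        # queue never has to compare the (uncomparable) function objects.
        self.tasks.put((priority, seq, fn, args))

    def _run(self):
        while True:
            priority, seq, fn, args = self.tasks.get()
            out = fn(*args)                  # evaluate the task
            with self._lock:
                self.results.append(out)
            self.tasks.task_done()

    def wait(self):
        """Block until every submitted task has been processed."""
        self.tasks.join()
```

This captures only the queueing discipline; the hard parts the talk refers to – ordering constraints between dependent tasks and keeping audio streams synchronized under real-time deadlines – sit on top of a mechanism like this.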
You can find out more details, including location for those in the Bay Area who may be interested in attending, at the official announcement site.
Much of the time for a presentation is spent making PowerPoint slides:
With slides out of the way, I can now turn to the more fun part, the short demos. This gives me an opportunity to work with TouchOSC for the iPad as a method for controlling OSW patches. We will see how that turns out later.