My Reflection on the London Rehearsal

During spring break, I had the chance to work with an orchestra based in London. At our rehearsal, the soundtracks I made for the BIFSC competition in 2025 and 2026 were added to the list of pieces the orchestra will play in future performances.

After that rehearsal, I really felt the difference between a digital soundtrack and a live performance. A real performance demands far more communication and organization than I expected, and it requires much more time and experience.

I talked with members of the orchestra and received a great deal of valuable feedback from them. It changed the way I think when I write music for real performers, and the experience also taught me much more about the benefits and limitations of human performance compared with computer sample libraries.

From the feedback I received, I identified two problems in the soundtracks I made. First, many instruments, such as the violin and cello, can only play one note at a time, which I did not consider when writing. Second, the dynamics I wrote must be communicated clearly through the sheet music. My sheet music contained mistakes that made it hard to read: chords within a single instrument's part that cannot be played, repeated dynamics that can be confusing, and long strings of rests that are hard to count. I made changes to both the 2025 and 2026 soundtracks.

For the 2025 soundtrack, I first removed all the dynamics. Many of them were written in whatever way was easiest for me to edit and for MuseScore's playback to render, but they were very hard for a person to actually read. I rewrote them as markings I can personally sight-read and spaced them apart so they are no longer crowded. This is what it looks like now.

I then added proper crescendo and decrescendo markings to the sheet music.

To solve the problem of a triad being unplayable by a single violinist, I split each chord across three players, each playing one note, so the chord still sounds the same in performance. For example, for a C♯ major chord, instead of writing it in one part, I divided it into C♯, F, and A♭ (enharmonically C♯, E♯, and G♯) across three instruments.


For the 2026 soundtrack, I made the same changes, and I also consolidated most of the rests into quarter rests so players can easily count along with the beat (previously they were strings of semiquaver rests).

Because the 2026 soundtrack uses more instruments, I had occasionally duplicated a few of them so I could see how many layers there were and time the score to the short animation. After performers gave feedback that reading two staves at once was difficult, I combined those pairs of staves, especially for the harp. Although I am still learning harp notation, I improved the part by removing rapidly repeated notes that cannot be played in real life, after observing the performer and the instrument itself.

I then finished editing the 2026 soundtrack by removing empty bars. In my original soundtrack, I left several long stretches of silence as pauses so the music would line up with the film's sound design. I expected performers to skip these bars with ease, but in rehearsal they could not easily count every bar of silence, and because everyone counted at a different speed, the section after the pause fell apart. The soundtrack now runs directly into the next part without the rest, or leaves only one or two bars of rest, so it is as easy as possible for people to time the start of the next section.

For my future work, I will avoid these mistakes and pay extra attention to each instrument's limitations, so my music can work both as a soundtrack and as something real people can perform.