CMP | Concept for Individual Presentation

I have now entered the second half of my Creative Media Practice module, which requires me to “conceive, develop, implement and finally present, a novel narrative media project” (Creative Media Practice Module Guide, LJMU). The idea is to collate all of the methods I have learnt over the past four mini-briefs, and then exploit the best technique(s) to tell a narrative.

I have chosen to do a modern adaptation of the classic fairytale Cinderella. I will be implementing both split screen and QR codes to enhance the narrative, enabling the audience to see the separate lives of the two main characters. I feel this can heighten the audience’s engagement, drawing on Hitchcock’s theory of suspense: the audience will know what is happening in Cinderella’s life and why she leaves the ‘Prince’ without notice, creating suspense as the ‘Prince’ is left clueless.

The QR codes also allow the audience to see the main characters’ personal lives through their mobile phones. The narrative is designed so the QR codes act as clues. Without them, the short film would still make sense; however, it is essential for the audience to use the QR codes, especially to find out the true conclusion.

My adaptation will be called ‘Elle’, starting with a young man and woman dancing at a party. The woman whispers in the man’s ear and they both disappear to different locations. The split screen will show him going outside, with a QR code revealing that he is texting a friend, asking where he is. The second screen will show the woman at the bathroom mirror, reapplying her make-up.

A QR code exposes an alarm that startles her whilst in the bathroom; the note attached to the alarm simply reads ‘Alex’. She starts to run from the party, fleeing past the man she has spent the night with, dropping her bracelet on the way. He shouts for her, the split screen becoming one as she goes into the distance. He notices she has dropped her bracelet and picks it up.

A series of QR codes and split screens then take the audience on a journey of discovery, revealing that she had to flee to give her ill brother his medication. The narrative goes on to show how much the man has fallen for the woman: he puts a picture of the bracelet on Facebook, asking his friends to share it and help him find its owner. He needs to find her. The woman, meanwhile, sits at home, caring for her brother. A text comes from her friend directing her to the man. She is overjoyed and adds him on Facebook. The story ends there, with the audience knowing that this is just the beginning for the new couple.

CMP | Week 4

Last week saw my final mini-brief for the Creative Media Practice module. The session explored the use of annotations in YouTube videos to create a non-linear, interactive narrative. YouTube enables producers to edit ‘boxes’ onto their videos, which can take the user to a completely different website, a different video, or even a different section of the same video. Producers have exploited the annotation function to create short films and clips that the user can interact and engage with.

The function is used in many different ways. For example, annotated videos can be used as a game, much like the one shown below.

There is now an endless list of possibilities for annotations, from promotional and story videos to educational videos. My favourite is shown in the video below.

This has had an excellent effect on the development of narratives, giving producers a greater level of creativity and the ability to engage with the audience. The diagram below is Freytag’s Triangle, from his book Technique of the Drama, published in 1863. It shows how the main character determines the rise or fall of the plot.

triangle 1

Mark Meadows (Pause and Effect, 2002)

Edgar Allan Poe rearranged this triangle, bringing his readers closer to the story. The diagram below shows how this brings us one step closer to interactivity.

triangle 2

Mark Meadows (Pause and Effect, 2002)

With today’s technology, there are various diagrams that map the broad range of narratives authors can now produce. The diagram below, named the Nodal Plot Structure, best symbolises how annotated YouTube videos work: nodal plots contain a series of non-interactive events, which lead to points of interactive choice.

triangle 3

Mark Meadows (Pause and Effect, 2002)

However, my group and I were able to create an annotated YouTube narrative that follows the Open Plot Structure diagram, shown below.

triangle 4

Mark Meadows (Pause and Effect, 2002)

Below is my group’s submission for the mini-brief. There are four of us to find; see if you can catch us all!

After playing the YouTube video, you can see how the narrative matches the Open Plot Structure: there are points of decision that carry the user to the next point of decision. Mark Meadows describes open plot structures as the most expressive for the user in his book Pause and Effect (2002).
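The open plot structure can be thought of as a graph: each video segment is a node, and each clickable annotation is an edge to another segment. A minimal sketch in Python, with segment names invented purely for illustration:

```python
# A rough sketch of an open plot structure: each node is a video segment,
# and each annotation click is an edge to another segment. The segment
# names and choices below are invented for illustration only.
story = {
    "intro":  {"choices": ["garden", "street"]},
    "garden": {"choices": ["street", "found"]},
    "street": {"choices": ["garden", "found"]},
    "found":  {"choices": []},  # ending segment: everyone has been caught
}

def reachable(start):
    """Return every segment a viewer can reach from `start` by clicking."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(story[node]["choices"])
    return seen
```

Because the viewer can loop between segments before reaching the ending, there is no single fixed sequence, which is what distinguishes an open plot from a nodal or linear one; `reachable("intro")` here returns all four segments.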

The interactive video that I experimented with taught me a lot about how to engage a user. Although the narrative takes a lot more planning, filming and editing, it seems a worthy technique that I should consider for future productions. I feel the Hide and Seek game conveys clearly how annotated YouTube videos work, as well as creating a fun narrative. However, the annotations need to be a little more clear-cut: sometimes when the user clicks to go to a certain section of the video, a frame from a random section is shown, creating an unprofessional feel. With more time I would have done this on a much larger scale, possibly across the city of Liverpool. I would get the user into trouble by ‘trespassing land’ within the video, or create comedy by running and falling over an object whilst searching for characters.


Overall, I feel the annotation facility in YouTube is a great, free device that all creative-thinking people should definitely explore.


Production Project Pitch

My final year of University allows me to create a major artefact that comes solely from my own ideas and inspirations. Today, I had to pitch my personal idea to the course and its lecturers. I knew that I wanted my project to involve new technology, possibly exploring multi-platform work or using different types of software. Narrative-wise, I delved in and out of different storylines, from activist campaigns to advertising campaigns.

I finally decided on the one that I presented today, with a working title of ‘One in a Million’. It is essentially a celebration of how extraordinary people are; it will explore how difficult circumstances give people strength and bring them close together. The narrative focuses on a 23-year-old girl, Emma. She was diagnosed with Vasculitis at the age of 19, a disease that affects 1 in a million people, hence the title. Vasculitis literally translates as ‘inflammation of the blood vessels’: her blood vessels become inflamed because her immune system attacks them. This results in reduced blood flow to the organs, or even blockages, which, without treatment, is fatal.

Emma has battled with death three times, leaving her friends, family and husband feeling upset, angry and hopeless. The project will explore how Vasculitis has affected each of these individuals in their own way. Their stories, connected together and represented through video, images and artefacts, will show how the disease has taught her, and everyone around her, extremely tough life lessons; in turn, this has made them strong enough to turn something bad into a positive. The idea is for the user to feel empathy towards Emma and, as the story comes to an end, to be moved that she has come through the other side, promoting the moral of living life to the full.

The narrative will take the form of an interactive, online documentary; examples include the ‘Pine Point’ website and National Geographic’s ‘Killing Kennedy’ website. The website will contain multiple pieces of content to create emotion and sympathy towards Emma and those around her, including video, images, artefacts, the written word, music and voice recordings. The content will be collected in chronological order, like a timeline of Emma’s life: leading from when she was a little girl, climaxing at her near-death experiences, moving through the resolutions that she and those around her found, and concluding with the peace and harmony of her wedding.

My presentation can be seen below, which also includes my inspirations.

(Text taken from my treatment)

CMP | Week Three

This week concentrated on the software Isadora, which creates an interactive platform enabling users to control a narrative. Isadora was originally created by Mark Coniglio, a composer, media artist and programmer, as part of a dance performance. The idea was that as a dancer moved into a certain area of the stage, Isadora would pick up the action and react to it.

An example of how the software enhances performances can be seen in the video below.

Future of Memory from troika ranch on Vimeo.

We were able to play around with a trial version of Isadora; I managed to create eyes that follow the cursor on your computer screen as it moves. Other great examples included a sound function, whereby, as the user speaks, Isadora hears this and produces an image of moving lips that sync to the user’s words. Another great idea was the control function, where the user can use the arrow keys to navigate to different pieces of narrative.
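Under the hood, the eyes-follow-cursor patch comes down to a small piece of vector maths: push each pupil from the eye’s centre towards the cursor, clamped to the eye’s radius. Isadora wires this up visually rather than in code, so the Python function below is just a sketch of the geometry, with all names my own:

```python
import math

def pupil_position(eye_center, eye_radius, cursor):
    """Place a pupil on the line from the eye centre towards the cursor,
    clamped so it never leaves the eye. Coordinates are (x, y) pixels."""
    dx = cursor[0] - eye_center[0]
    dy = cursor[1] - eye_center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        # Cursor is exactly on the eye centre, so the pupil stays put.
        return eye_center
    # Scale the offset so the pupil travels at most eye_radius pixels.
    scale = min(dist, eye_radius) / dist
    return (eye_center[0] + dx * scale, eye_center[1] + dy * scale)
```

Called every time the cursor moves, this keeps the pupil pointed at the cursor: a far-away cursor pins the pupil to the edge of the eye, while a nearby one lets it sit directly underneath.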

I feel Isadora is a great new technology that needs to be explored further. The user becomes pulled into the narrative, by selecting and rejecting areas they want to explore further, especially with the control function.

Unfortunately, the trial software doesn’t have a save function, so I was not able to upload my work to this blog post. However, my favourite professional use of Isadora is in the performance shown in the video below.

CMP | Week Two

Week two of Creative Media Practice enabled me to explore second screen technologies. Second screen refers simply to a second screen of content that complements the first. For example, a person could be watching a television show, such as The X Factor, whilst also playing on The X Factor’s play-along app on their tablet.

The first TV show to use the second screen was Grey’s Anatomy. The free app came out in 2011, and the video below goes into details.

In the words of Thomas Elsaesser, “… the default value of cinematic storytelling is rapidly becoming that of the interactive video-game and the computer simulation game” (Puzzle Films, Complex Storytelling in Contemporary Cinema, 2009).


Filmmakers have now started to become creative with the way the audience can use their second screens. For example, Prometheus (Ridley Scott, 2012) on Blu-ray enabled users to explore the usual content, such as storyboards and concept art for the film. However, the film could also communicate with the app, so users saw an alternate scene at the right moment whilst watching. Users could even flip the alternate scene onto the big television screen, and reverse it when the scene was finished.


When creating two pieces of content to fit together, it’s important to make sure that one screen doesn’t overrun the other. The user needs to make sense of the two pieces and see the two screens as complementary to each other. “The second screen material just can’t be filler. It has to be information we want to know and the information has to connect to the character” (Joseph Saroufim).

Our mini-brief this week challenged us to use a second screen to enhance a narrative. We were asked to use at least three QR codes (quick response codes) to implement the second screen technology. You can see my team’s attempt in the video below; make sure you have your QR scanner at the ready!

Although the QR code is seen as a dying technology, the short film clearly demonstrates how second screen technology can support a narrative. Through it, users gain a personal connection to the main character, as they are looking at the very mobile phone screen he is using. I feel the second screen doesn’t interrupt the first, as we filmed and edited with this in mind. The same applies to the placement of the QR codes: we made sure there was enough time to scan each code, and put them in aesthetically pleasing frames.
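Making sure there is “enough time to scan” can be treated as a simple check over the edit: each code’s out-point minus its in-point must be at least the few seconds a viewer needs to raise a phone and scan. A quick sketch, with the timings, labels and four-second threshold all invented for illustration:

```python
# Each entry is (label, in_point_s, out_point_s) for a QR code in the edit.
# The labels and timings below are invented for illustration only.
codes = [
    ("text message", 12.0, 18.5),
    ("alarm note", 44.0, 47.0),
]

MIN_HOLD_S = 4.0  # rough time a viewer needs to raise a phone and scan

def too_brief(codes, min_hold=MIN_HOLD_S):
    """Return the labels of QR codes held on screen for less than min_hold seconds."""
    return [label for label, t_in, t_out in codes if t_out - t_in < min_hold]
```

Running the check flags any shot that would need extending in the edit before the codes become reliably scannable.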

However, it was very hard to ensure the user would play the second screen content at the right time. I disliked the aesthetic of putting text on screen telling the viewer when to play the content, but without it, the second screen would look unprofessional.

CMP | Week One

Last week saw the beginning of Creative Media Practice. This module fits into my final year of study at University and explores new technologies that create innovative storytelling. The two sessions looked at the use of three screens, or in other words, Triptych screening. The Triptych can be dated back to the Middle Ages, when this form of narrative was used for worship, as religious iconography.

middle ages

“Spatial montage represents an alternative to traditional cinematic temporal montage, replacing its traditional sequential mode with a spatial one. Ford’s assembly line relied on the separation of the production process into a set of repetitive, sequential, and simple activities. The same principle made computer programming possible: a computer program breaks a task into a series of elemental operations to be executed one at a time. Cinema followed this logic of industrial production as well. It replaced all other modes of narration with a sequential narrative, an assembly line of shots, which appear on the screen one at a time”

- Lev Manovich

The Triptych first appeared in cinema in the early 1900s, with films such as Suspense (Lois Weber, 1913) and, later, Napoléon (Abel Gance, 1927).


My favourite example of Triptych from the past is the title sequence created by Saul Bass for Grand Prix (1966). You can watch the sequence here:

grand prix

Split-screen narrative not only lets the storyteller give the audience enhanced information in a stylistic way; it can also be used to confuse the narrative, much like the film Sisters (Brian De Palma, 1973), which appears to tell the story of twin sisters with opposite personalities. The film concludes that the screens were actually telling the story of just one girl with a split personality.


The Triptych is used today across numerous media platforms, from graphics and TV to cinema and the web. My favourite piece is HBO Voyeur (2007).

hbo voyeur

The session led to the start of one of four mini-briefs: to go out, film and edit a Triptych screen. Two others and I went on to present the video below the following day.

I like the 180-degree effect that the Triptych gives the audience; I can see it working in a nightclub behind a DJ, or in an exhibition with the centre video projected onto one wall and the left and right images projected onto walls angled towards the middle screen, encasing the audience in the narrative.

However, I feel the wobbling in the frames makes the Triptych look out of sequence. Either a tripod will have to be used in the future, or it would be interesting to explore what effect deliberately increasing the wobble would have on the narrative. Furthermore, with more time it would have been nice to have expanded the narrative, using each screen as an individual strand that concluded and met up at the end.

Let the Fun Begin

I’ve been working for Liverpool SU for almost a month now, as a Digital Content Producer and as Marketing Supervisor for the Byrom Street campus. Even though it’s been work, work, work, I’ve been having such a good time. Meeting and chatting to new people and engaging with all the events the SU has to offer, from Laser Quest to filming club events, I’ve thoroughly enjoyed myself. It hasn’t come without hard work, though! There’s nothing better than loving what you do.

I don’t officially start University until tomorrow, however, and that’s when the real fun begins. At the moment I’ll be concentrating on a report from my time at Lime Digital, creating ideas for a Production Project, and creating an Interactive Exhibition for a museum. I’ll keep you posted! I can’t wait to get back into the swing of things and get back into designing.


Digital Content team trying out Laser Quest before the shoot.