Project Proposal - Digital Media
Candidate No. 107938
Digital Media Project Proposal
We live in a society where we are constantly surrounded by mobile phones: a
‘media-rich environment, in which digital technologies are proliferating faster than
our cultural, legal, or educational institutions can keep up with them’ (Bolter and
Grusin, 1999, p.5). Mobiles have become an essential piece of technology in
establishing and shaping the world of ubiquitous computing we live in today, so
much so that we have become conditioned to the devices. Although smartphones
are among the most technologically advanced commodities in the world today,
they have not always been that way. My MAX application will attempt to explore
and document the changes in technology and their cultural value from when Martin
Cooper of Motorola made the first publicized handheld mobile phone call in 1973
until today, investigating smartphones, and the two current market leaders in mobile
technologies Apple and Samsung.
My project aims to provide a multimedia narrative which engages with the history of
the mobile phone and explores its social impact on the world. I aim to create an
application, which allows the user to engage in a two-way interaction between
human and computer, educating themselves about the history of mobile devices. I
want to create a sensory experience that is usually dissociated from history
through a seamless and responsive interaction design process between user and
program, enriching the user’s life through interaction with technology and historical
artefacts. I aim to create an interactive application which will document the
evolution of the mobile phone from its invention up until the present
day. This creates an archival record of the history of the mobile phone as an
artefact that can be looked back on and will hold cultural value in the future. I will
accomplish this through investigating and experimenting with the software ‘MAX
MSP’, a powerful tool which allows for the development of interactive media, new
media and music programming.
I have had a couple of ideas about how I am going to explore this theme through
the software. The first of my ideas will show the evolution of the mobile phone
through a quiz. The quiz will become interactive by using buttons on the keyboard
to trigger points for the player. Specific buttons on the keyboard could be
programmed to correlate to correct and incorrect answers to a question about the
history of the phone, which is shown on the screen. Once the correct answer is
triggered, a video will play which has been nested in the interface to the right of the
question (see figure 1). The questions could be on a countdown timer, which would
make the application more engaging: the audience would know
they are under pressure to choose the right answer before the
countdown reaches zero. I could easily create a button/buzzer and scoring system
within MAX, which triggers a video to be played, but moving the quiz between
questions and essentially resetting the interface for each question could potentially
become a problem. An alternative input method could be the use of fiducials, which
correspond to numbers, instead of the use of keys/numbers. I don’t think this idea is
entirely feasible as the level of knowledge needed to program this properly and
make it into a functioning application is beyond my ability.
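The scoring logic described above can be sketched outside MAX as a short piece of Python, purely to clarify the idea. The question data, key names and video filename below are hypothetical placeholders; the real version would be a MAX MSP patch with keyboard input objects wired to a counter and a video player, not a script.

```python
# Minimal sketch of the quiz logic: a key press is checked against the
# correct answer, a point is awarded, and a video clip is cued on success.
# All data here is hypothetical placeholder content.
QUESTION = {
    "text": "Who made the first publicized handheld mobile phone call in 1973?",
    "options": {"a": "Martin Cooper", "b": "Steve Jobs", "c": "Alexander Bell"},
    "answer": "a",
    "video": "videos/q1_cooper.mov",  # clip nested to the right of the question
}

def answer_question(question, key_pressed, score):
    """Map a key press to correct/incorrect, returning (new_score, video_to_cue)."""
    if key_pressed == question["answer"]:
        return score + 1, question["video"]  # correct: add a point, cue the video
    return score, None                       # incorrect: no point, no video
```

Resetting the interface between questions, the part I anticipate being difficult in MAX, would amount to swapping `QUESTION` for the next entry in a list and clearing the score display.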
This idea could be further adapted into a museum setting by creating an interactive
quiz which would be situated at the end of an exhibit. This idea could be brought to
life by displaying the questions on a screen at eye level and swapping
the keyboard buttons for interactive touch pads as the input
method. Once the question is displayed, the user will select the option (displayed on
touch pads) which they think answers the question correctly. This idea comes from
the ‘Storytelling With Sound’ interactive exhibit at the Walt Disney Family Museum,
San Francisco, California. (http://secondstory.com/project/storytelling-with-sound)
My second idea further explores the history of the mobile phone through a webcam
based interactive application, and is the idea which I have chosen to explore through
my developed concept. The application will feature multiple video streams
embedded within the space, allowing the narrative to be told through the medium
of video. Using video as a conveyor of content means that the audience will stay
engaged and interested in the information being shown as it is visually and audibly
stimulating. The application will use the computer software reacTIVision, “an open
source, cross-platform computer vision framework for the fast and robust tracking of
fiducial markers attached onto physical objects, as well as for multi-touch finger
tracking. It was mainly designed as a toolkit for the rapid development of table-
based tangible user interfaces (TUI) and multi-touch interactive surfaces”
(http://reactivision.sourceforge.net/).
The application will work by cueing different videos around the screen, which will
be laid out like a branching mind map (see figure 2). The videos used in my
interactive application will convey information about the different types of phone
throughout their history. The videos themselves will be remediated from YouTube or
other video hosting websites. In an ideal situation I would like to be able to embed
YouTube videos, but limitations and complications of the software mean that this is
almost impossible. I intend to reflect the current popular design aesthetic of ‘flat
design’, which has been seen across new media and in Apple’s latest mobile
operating system iOS7. I have chosen to display multiple video screens at once, to
imitate the hypermedia design of the Windows operating system with its multiple
panels and screens. Further supporting the underlying theme of hypermediacy, I
plan to include lots of smaller looping videos or GIFs which will fill in the blank
space, giving it a post-modernist feel.
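The core dispatch, where a fiducial marker ID arriving from reacTIVision selects which embedded video to play, can be sketched as a simple lookup. The marker IDs, phone names and filenames below are hypothetical, and in practice the ID would arrive in MAX as a TUIO message rather than a function call:

```python
# Hypothetical mapping from reacTIVision fiducial IDs to the videos laid
# out in the branching mind-map interface.
FIDUCIAL_VIDEOS = {
    0: {"phone": "Motorola DynaTAC 8000X", "video": "videos/dynatac.mov"},
    1: {"phone": "Nokia 3310", "video": "videos/nokia3310.mov"},
    2: {"phone": "Apple iPhone", "video": "videos/iphone.mov"},
}

def on_fiducial_detected(fiducial_id):
    """Return the video to cue when the webcam sees a marker,
    or None if the marker is not part of the exhibit."""
    entry = FIDUCIAL_VIDEOS.get(fiducial_id)
    return entry["video"] if entry else None
```

The lookup shape matters for the museum version: adding a new artefact only means adding one entry, not rewiring the patch.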
The interaction within the application is through the user engaging with the
fiducials, or the objects which would be on display if the concept was further
developed for a museum setting. Interaction can also occur through the interface itself
via hyperlinked text or URLs to webpages of interest, which would further
help the story to be told. Including hyperlinks in this way means that the narrative
will be able to flow in new directions, completely decided by the user. The use of
multiple fiducials means that the users of the application can pick and choose what
artefacts they are interested in knowing more about. The application will operate
using Lister’s ideas of hypertextual navigation, as discussed in New Media: A Critical
Introduction (2009). The user will make use of the ‘computer apparatus and software
to make reading choices in a database’ (p.22). The database in this case is the
group of mobile phones I have chosen to represent, which illustrate the evolution of
the mobile phone. The use of hypertextual navigation throughout the database
means that the narrative of the application is not linear; instead it is multidirectional.
The multiple fiducials act in a similar way to hyperlinks, steering the story in
multiple directions and allowing the creation of paths that ‘do not necessarily follow
routes and destinations entirely generated by the story’s creator’ (Alexander and
Levine, 2008). The addition of hyperlinks also adds to the underlying theme of
hypermedia by enabling integration with the internet.
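The non-linear structure described above can be sketched as a small graph in which each node (a phone in the database) links to related nodes, so the path through the content is generated by the user's choices rather than fixed by me. The node names and links below are hypothetical placeholders:

```python
# Sketch of hypertextual navigation: each phone links to related phones,
# so the narrative path is chosen by the user, not fixed by the author.
# Node names and links are hypothetical placeholders.
NODES = {
    "dynatac":   {"links": ["nokia3310"]},
    "nokia3310": {"links": ["dynatac", "iphone"]},
    "iphone":    {"links": ["nokia3310"]},
}

def follow(path, choice):
    """Extend the user's path if `choice` is hyperlinked from the current
    node; otherwise leave the path unchanged."""
    current = path[-1]
    if choice in NODES[current]["links"]:
        return path + [choice]
    return path
```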
Action and reaction are at the heart of any interactive design process. Manovich
(2001) suggests that ‘a program reads in data, executes an algorithm, and writes out
new data’ (p. 198). This idea will be executed within my application through
interactions between the webcam, the fiducial markers and the
reacTIVision software, which will trigger a series of algorithms within the computer,
eventually causing a video to play within its specified window. Within the
application, no physical interaction with the interface is needed from the user to
convey the main educational content of the application. The use of the fiducials
means that the message can be conveyed without even clicking a button. If the user
wants to find out more about a certain element of the history of the mobile phone,
they can make the physical interaction with the interface by choosing to engage
with the hyperlinks which I have already discussed. Manovich (2001) suggests that
‘creating a work in new media can be understood as the construction of an interface
to a database’ (p.200). The application will operate on Manovich’s idea of a
database system, meaning users can select specific information within the
application which can be accessed almost instantaneously. Within the application,
the narrative is not fixed, allowing visitors to cue the information they are interested
in, making the content specific to them.
The inspiration for this application came from reflecting on my visit to the Walt
Disney Family Museum in San Francisco, in 2012. (http://www.waltdisney.org/)
When I visited the museum I was really impressed with the level of interactivity
throughout the museum and how narratives were told through sound and interactive
multi screened displays (see figures 3, 4 and 5). I took inspiration from the exhibits at
the museum and tried to visualise an application, which would work in a museum
setting. Once I had developed a concept, I worked backwards, thinking about how
I could simplify it to create a prototype version using MAX MSP.
As a prototype, there will be no physical interaction with the actual objects. I plan to
attach the fiducials to images of the phones, representing the actual physical
phones which I would provide as part of the application if it was to be developed
further to fit into a museum setting. Its digital medium allows for easy access and
manipulation so the application can be applied to different contexts and settings. If
the application could be developed for use in an interactive installation within a
museum, there would be screens built into a wall similar to the ‘Steamboat Willie
Wall’ at the Walt Disney Family Museum (Figure 5). A box of artefacts (phones from
different time periods) would be provided so that the subject could physically
interact with objects from the past. These objects would have the fiducials fixed to
them so the user could hold the artefact up to a webcam built into the same wall.
The webcam would recognise the objects and then a video would play on one of
the screens to explain why that phone is significant and what made it different.
In a museum or exhibition setting, the fiducials could be swapped with QR codes,
therefore allowing the objects to be recognised and read by mobile devices as well
as computers. This would mean that a mobile phone application could be
developed and created alongside the interactive museum application, which would
provide information about different types of mobile phone, whether it be through
the mobile app itself or through redirecting to webpages of interest. Although a
visual representation of the webcam isn’t really needed within the prototype, the
inclusion of one in a gallery setting would be beneficial in attracting children and
getting them to engage with history, a topic which can sometimes be seen as
boring by younger people. This is a way of getting young people involved and
interested, seeing themselves and physically interacting with artefacts of the past,
encouraging them to engage with the history of the objects.
My application fits into the framework of interaction design, a ‘relatively new field in
the category of computer science that defines the ways in which a person can
interact with a computer system, be it a mobile device or a mainframe server’
(http://www.interactiondesign.com.au/). My application is a piece of new media,
which, in its nature, is digital and interactive. ‘Where ‘old’ media offered passive
consumption new media offered interactivity’ (Lister, 2009, p.21). The application
adheres to Lister’s (2009) meaning of the word ‘interactive’ as something that
‘signifies the user’s ability to directly intervene in and change the images and text
they access’ (p.22). It also adheres to Manovich’s (2001) idea of new media objects
as ‘collections of individual items, where every item has the same significance as any
other’ (p.194).
Bibliography
Alexander, B. and Levine, A. 2008. “Web 2.0 Storytelling: Emergence of a New
Genre”. [Online] Available from:
http://chnm.gmu.edu/courses/schrum/ctch792sp10/wp-content/uploads/2010/01/Web-2.0-Storytelling.pdf
[Accessed: 11 Mar 2014].
Bolter, J. D. and Grusin, R. 1999. Remediation: Understanding New Media. [E-book]
MIT Press. Available through: Sussex Study Direct
https://studydirect.sussex.ac.uk/mod/resource/view.php?id=615218 [Accessed: 11
Mar 2014].
Interactiondesign.com.au. 2014. Interaction Design: Defining The Structure &
Content of Communication. [Online] Available at:
http://www.interactiondesign.com.au/ [Accessed: 11 Mar 2014].
Lister, M. 2009. New Media: A Critical Introduction. [E-book] London: Routledge.
Available at:
http://hp2.philol.msu.ru/~discours/uploadedfiles/courses/poselyagin/textbooks/New_media.pdf
[Accessed: 11 Mar 2014].
Manovich, L. 2001. The Language of New Media. [E-book] Cambridge MA: MIT
Press. Available through: Sussex Study Direct
https://studydirect.sussex.ac.uk/mod/resource/view.php?id=615223 [Accessed: 11
Mar 2014].
Reactivision.sourceforge.net. 2014. reacTIVision. [Online] Available at:
http://reactivision.sourceforge.net/ [Accessed: 11 Mar 2014].
Secondstory.com. 2014. Storytelling with Sound | Second Story. [Online] Available
at: http://secondstory.com/project/storytelling-with-sound [Accessed: 11 Mar 2014].
Waltdisney.org. 2014. The Walt Disney Family Museum. [Online] Available at:
http://www.waltdisney.org/ [Accessed: 11 Mar 2014].
Appendix:
Materials needed:
• A computer
• MAX MSP
• Adobe Photoshop
• An internet connection to access YouTube.com
Production plan

Week 8 (10 Mar – 14 Mar) — Assessment 1 hand in
• Finish writing proposal.
• Test developed concept MAX patch.
• Create storyboard of how I would like the app to function.

Week 9 (17 Mar – 21 Mar)
• MAX patch development.
• Start sourcing videos which I will be including in my application.

Week 10 (24 Mar – 28 Mar)
• Further MAX patch development.

Week 11 (31 Mar – 4 Apr)
• User testing of the application. Taking people’s opinions and changing my
application accordingly to make sure the users get the best experience possible
out of using it.

Week 12 (7 Apr – 11 Apr) — End of term
• I won’t be able to make it to class this week as I will be at the National Student
Television Awards.
• I aim to have my final application finished by this week.

Assessment Period Wk 1
• Final touches on application – making it look aesthetically pleasing.
• Start writing the 250 word introduction.

Assessment Period Wk 2
• This week will be left for contingency. If I fall behind schedule I will use this
time to get myself back on track.
• Film the video walkthrough of the app.

Deadline: Tuesday 20th May
Fig. 3: Rockwell Group. n.d. Integrated wall display at Walt Disney Family Museum.
[image online] Available at: http://www.rockwellgroup.com/projects/entry/walt-
disney-family-museum [Accessed: 11 Mar 2014].
Fig. 4: Rockwell Group. n.d. Multiple video screens at Walt Disney Family Museum.
[image online] Available at: http://www.rockwellgroup.com/projects/entry/walt-
disney-family-museum [Accessed: 11 Mar 2014].
Fig. 5: 'Steamboat Willie' Wall at Walt Disney Family Museum. n.d. [image online]
Available at: http://www.sfgate.com/bayarea/article/Disney-fans-flock-to-Presidio-
museum-on-Day-One-3216041.php#photo-2358762 [Accessed: 11 Mar 2014].