How To Recreate An Apple Advertisement


Research

Apple is one of the most valuable technology companies in the world, creating products
such as the iPhone, Mac, iPad, iPod, Apple Watch and AirPods. It is also one of the
most influential and recognisable brands in the world, largely responsible for the rise of the
smartphone through the iPhone.
The iPhone, first launched in 2007, is still one of its most desirable products, with annual
iPhone revenue rising from $1.8 billion in 2008 to $191.9 billion in 2021. The brand has built
its reputation on both the performance of its products and its branding. As well as the sleek
design and simple, clean packaging, Apple's adverts are renowned for their glossy image and
their focus on innovation and user experience. For this project, I am going to create an Apple
iPhone advert and follow it up with a survey on how successful it is, asking participants to
compare an original Apple iPhone advert with the one I have replicated.
Initial planning
In order to replicate the feel and experience of an Apple iPhone advert, I will be using
multiple programs, which include:
 Adobe After Effects 2020 (vfx/graphics/editing)
 Adobe Photoshop 2020 (graphics/idea planning)
 Blender (All 3D work will be done in this)
 Sony Vegas Pro 17.0 (sfx)
Apple tends to use lots of buzzwords that mean little to the everyday consumer, such as
“11.8 billion transistors” and “80% faster neural engine”, to promote its products to a wider
audience who may only know a little about phones and simply assume that a bigger number
means a better phone. Following the development of Apple’s advertisements from 2007-2020,
it is apparent that the more recent adverts focus much more on the technology and
advancement of the product, whereas earlier adverts focused on ease of use and the wide
availability of being able to call your friends. My advert will be aimed at teenagers and
young adults, as they are the biggest buyers.
My Apple inspired advertisement will consist of the following criteria:
 Creativity
 Buzz words
 Fast-paced editing
 No linear narrative
 Sync on beat to intensify the feeling of professionalism and sophistication.
The production process
I will be using Blender for all the 3D work and modelling in this advertisement. I began by
opening the software and importing a reference image of the iPhone 13.

[ref]
I modelled the design for my smartphone (the reference image on the right was an iPhone 13
Pro Max) in Blender; it will be duplicated and textured in several colours later down the
line. Thankfully, this iPhone’s model is relatively easy to replicate as it holds a rectangular
shape.
Next, I textured and coloured the iPhone using different materials to match the reference
image, and found that Eevee would best suit my needs as the rendering engine because I will
not be using a lot of glass-like materials (which the Cycles render engine excels at). This
means that render times can be greatly reduced, giving me more creative freedom to try
out different animations with few repercussions.
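As a rough illustration of that engine choice, switching the scene to Eevee can be done from Blender's Python console; the property names below are from Blender 2.9x/3.x and the sample count is just an assumed value, not my exact setting.

import bpy

# A minimal sketch of the engine choice, not my exact settings.
scene = bpy.context.scene
scene.render.engine = 'BLENDER_EEVEE'   # fast rasteriser; Cycles kept only for glass-heavy shots
scene.eevee.taa_render_samples = 64     # assumed sample count, enough for clean matte materials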
After running through several songs and genres that I thought would be suitable for this
style of advertisement, I narrowed it down to this song:
https://soundcloud.com/00ff1a/recycler and then proceeded to cut it down to 31 seconds
(the length of my advertisement). The song was found through SoundCloud's related tracks
section.

Now that I have found a song, I can import it into After Effects and create the main
composition. I will be using 24fps: although it is not as smooth as 30fps, it does not have
the grittiness of 20fps, so it is a good middle ground. The song also influences the style
and shape of the advertisement through its genre, BPM and tempo; for example, a slower song
will include fewer shots as there are fewer sync points than in a faster song. The footage
below has been slowed down to show the effect of frame rate more clearly.
20fps vs 24fps vs 30fps

Now that the song has been chosen, it is easier to visualise exactly what I want to include
and at what points. To achieve this, I have typed out my ideas on After Effects markers,
which has helped with displaying the ideas I want to execute, as well as acting as sync points
that show me when to cut/edit a shot (the white boxes with text are markers).
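To show the arithmetic behind the sync points, here is a small Python sketch that converts a tempo into frame numbers at 24fps; the 128bpm figure is only an assumed tempo for illustration, as it is not the measured tempo of the track.

import math

fps = 24          # frame rate of the After Effects composition
bpm = 128         # assumed tempo of the track, for illustration only
duration_s = 31   # length of the cut-down song

frames_per_beat = fps * 60 / bpm   # 11.25 frames between beats at 128 bpm
beat_frames = [round(i * frames_per_beat)
               for i in range(math.floor(duration_s * bpm / 60) + 1)]
print(beat_frames[:8])   # candidate frames for markers/cuts: [0, 11, 22, 34, 45, 56, 68, 79]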
I am using the Blender plugin ‘true time remapping’, which recreates After Effects’ time
remapping in Blender and ultimately makes the workflow a lot easier. This requires me to add
a solid layer into the composition and work out how many frames I need to render in Blender
for each shot. On the left is a screenshot of true time remapping in Blender and on the right
is a screenshot of Adobe After Effects’ time remapping – as you can see, they are very
similar.
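As a simplified example of that frame calculation (assuming a constant remap speed rather than the fast-slow curves I actually use), the number of Blender frames needed for a shot works out like this; the shot length and speed are hypothetical values.

import math

shot_length_frames = 36   # hypothetical length of the shot in the composition
playback_speed = 2.0      # hypothetical remap speed: 2.0 = footage plays back at double speed

# Remapped footage consumes source frames faster (or slower) than real time,
# so the Blender render needs the shot length multiplied by the playback speed.
frames_to_render = math.ceil(shot_length_frames * playback_speed)
print(frames_to_render)   # 72 rendered frames cover a 36-frame shot played at 2x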

Once I knew how many frames I needed to render for the first shot, I created a basic
animation inspired by the camera movements of Apple adverts. This was achieved by keeping the
model still and moving Blender’s built-in camera around the iPhone.
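A rough Python sketch of that camera move is below; it assumes the camera is already framed on the phone and simply parents it to an empty at the phone's position, so only the empty's rotation needs keyframing. The object names and frame numbers are placeholders, not the ones from my scene.

import bpy
import math

scene = bpy.context.scene
camera = scene.camera

# Empty object used as a pivot at the phone's location (placeholder setup).
pivot = bpy.data.objects.new("CameraPivot", None)
scene.collection.objects.link(pivot)
camera.parent = pivot   # the camera now orbits wherever the pivot rotates

pivot.rotation_euler = (0.0, 0.0, 0.0)
pivot.keyframe_insert(data_path="rotation_euler", frame=1)
pivot.rotation_euler = (0.0, 0.0, math.radians(90))   # quarter turn around the phone
pivot.keyframe_insert(data_path="rotation_euler", frame=50)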

shot1 demo compressed


At this point I was still experimenting with the two render engines that come with Blender,
Eevee and Cycles. I will be sticking with Eevee for the foreseeable future of this project:
rendering with Cycles is not only more intensive and power-hungry on a computer, it is also
far slower. Eevee took only a few seconds to render a frame, whereas Cycles took over a
minute per frame; if I were to render a 50-frame shot at 128 samples, a single shot would
take over 50 minutes. This immediately rules out Cycles for most of the project. (The
close-up shots might benefit from Cycles, as it is better than Eevee at rendering glass, but
I want to keep the look consistent.)
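The rough maths behind that comparison, using my approximate per-frame times (the seconds-per-frame values below are estimates rather than measured benchmarks):

frames = 50                # frames in a typical shot
eevee_s_per_frame = 5      # roughly "a few seconds" per frame in Eevee
cycles_s_per_frame = 65    # just over a minute per frame in Cycles at 128 samples

print(f"Eevee : {frames * eevee_s_per_frame / 60:.1f} minutes")    # ~4.2 minutes per shot
print(f"Cycles: {frames * cycles_s_per_frame / 60:.1f} minutes")   # ~54.2 minutes per shot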
To briefly break down the animation: I manipulated the apparent movement of the phone by
adding a camera and panning it around the phone. The close camera angle, coupled with the
fast-slow velocity of the remapping, creates a grander sense of scale and importance, whilst
including the Apple-inspired lighting.
shot1 breakdown blender
For every shot in the project that was produced in 3D, I used a combination of area, spot and
sun lighting to further align with Apple’s modern and sleek style of advertising.
Additionally, as I am using Eevee for this shot, I can use bloom to replicate the
‘glossy glow’ seen in some Apple adverts.
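A sketch of that light rig and the bloom setting via Blender's Python API is below; the positions and energy values are illustrative guesses rather than my exact numbers, and use_bloom applies to the older Eevee in Blender 2.8x-3.x.

import bpy

scene = bpy.context.scene

def add_light(name, light_type, energy, location):
    # Helper to create a light object and place it in the scene.
    light_data = bpy.data.lights.new(name, type=light_type)
    light_data.energy = energy
    light_obj = bpy.data.objects.new(name, light_data)
    light_obj.location = location
    scene.collection.objects.link(light_obj)
    return light_obj

add_light("KeyArea", 'AREA', 400, (2, -2, 3))    # soft key light on the phone
add_light("RimSpot", 'SPOT', 800, (-3, 1, 2))    # rim highlight along the edges
add_light("FillSun", 'SUN', 2, (0, 0, 5))        # even, low-intensity fill

scene.eevee.use_bloom = True   # the 'glossy glow' seen in some Apple adverts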
At this point the screen did not have any texture on it because I was having trouble with the
UV mapping of the texture, but this was fixed at a later point.
I took inspiration from this Apple advertisement and recreated a shot type and camera
movement to reuse in my own advert. The model Apple used in their advertisement appears to
have a glossier coating on the back of the phone than my matte finish, which means that light
will be reflected differently.

I have chosen to use a free font called “SF UI” as it was the closest font to the Apple one
that I could find. The text on the left is from my video and the text on the right is from this
Apple advertisement.
Apple appears to have a very distinct narrative in its choice of font. It sets a deliberate
tone of voice in text such as “A15 bionic chip” and “True Tone Flash with 2x greater
uniformity”. The average consumer will not understand what any of this means, which indicates
how Apple is also trying to appeal to higher-end consumers who have more money to spend on
everyday products such as phones or laptops. This is further reinforced through the use of
Apple’s simple but recognisable logo, as well as the seemingly revolutionary technical
advances. Apple presents itself as the first company to introduce advanced technology such as
face recognition, when its competitor, Android, has been in this market for years already.

For the A15 bionic chip sequence, I used After Effects for all the text and animation in this
shot. I created a text layer with “A15”, then precomposed it so I could animate the scale
attribute to increase with a fast-to-slow velocity. Next, I added the “bionic chip” text into
the precomposition so it shares the same scale keyframes, which also keeps things neater.
Following this, I modified the transformation of each text layer adjacent to the square.
a15

As I wanted to fill out the intro part of the advert, I decided to use the two fast-paced
beats to showcase a different colour of the model. Using the same iPhone model, I created a
red version, as in most Apple adverts they prefer to showcase a range of colours to further
engage their audience. I personally prefer the look of this red design to the original blue
design – I am pleased with the way it turned out.
For this model, I came back to fix the UV mapping and overlay the screen texture. The reason
I have only done this part now is that I was inexperienced with UV mapping and did not know
how to use the texturing tools. This has since been resolved and both models now have the
screen texture mapped.
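For reference, a minimal sketch of how the screen texture can be hooked up as a material once the UV map is fixed; the image path and material name here are placeholders, not the files from my project.

import bpy

mat = bpy.data.materials.new("ScreenMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Image texture node pointing at the screen graphic (placeholder path).
tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("//textures/iphone_screen.png")

# Feed the image into the default Principled BSDF using the fixed UV map.
principled = nodes["Principled BSDF"]
links.new(tex.outputs["Color"], principled.inputs["Base Color"])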

Below is everything that has been edited and rendered so far:
demo1 (incomplete)

Breaking down the first demo in chronological order:


 The Apple logo was downloaded from Google, and I added a combination of
s_textureflux, Hue and Saturation as well as exposure to achieve a modern but
colourful look.
 Apple tend to sync on beat with their advertisements in order to achieve a more
satisfying and professional feel, which is what I have done with the text; similarly,
the text is cut up which is another Apple trope included in their adverts.
 On the iPhone 13 text, I synced the text on beat and used a similar combination of
effects to the one on the Apple logo at the start, but slightly altered the colour speed.
 The first shot showcases the whole back of the phone in a typical Apple-styled camera
movement (I used a 3D camera in Blender to move around the phone rather than animating the
phone itself, as it saves time and achieves a more professional look).
 The second shot in the sequence is directly replicated from the existing Apple advert
“Now in Green”. It is a tracking shot of the phone lying flat, face down.
 The fourth shot is two shots in one, cut at the same point; I rendered out one shot in
blue and the other in red, then cut between them on the beat. The camera rolls and showcases
the details of the iPhone’s complex cameras. I believe this is one of the best shots in the
advertisement, as the lighting and detail greatly complement the modern feel.
 As for transitioning between clips, I used s_pixelsort to create a distorted image but
only on a single frame to avoid it becoming too much and detracting from the
professional look.
 I used the same effects from the Apple logo on the text ‘colour accurate’ but masked
them to the word ‘colour’ only - the closest theory I could find to this was the Stroop
effect. Furthermore, I used s_stretchframeedges to create a smoother look on the text as it
transitions in – this goes against Apple conventions, as they seem to keep their text ‘still’.
 The following shot was of the phone being animated to spin and was taken from this
advertisement. I used different layers and texture flux under the blending mode
‘lighten’.
 In the next step, I transitioned to the text “depth control” using radial blur to help
ease in the scale transformations.
 The sequence that followed was two subsequent fast-paced close-up shots; the first shot
of this sequence was inspired directly by the iPhone XR trailer.
 This A15 bionic chip sequence is clearly still a work in progress, as I am currently
making the PCB.

Upon rendering it out, there were two problems that I noticed straight away. At 0:05 I used
s_lightleaks to help blend in the first iPhone shot, but it needs to be changed so the light
blobs have more animation to them; I want them to be a fast-slow explosion of a blue and
green blurring gradient. Additionally, the first shot of the iPhone has a fading-in look
because of the lighting I used in Blender. The shot starts at 100% opacity when it should
start at around 33% opacity and then be keyframed to fade back in; this will help blend the
first shot with the black background. Moreover, I think the A15 bionic chip animation would
look better if the square were subtly animated so that it starts a little thinner and then,
as it scales quickly at the beginning, snaps back to its normal size. There is also a
one-frame glitch when the phone in the ‘colour accurate’ shot transitions to the ‘depth
control’ shot, which still needs fixing.
One of the biggest problems I have faced in the EPQ was animating the iPhone model to move
where I need it to, as the models I made have a lot of individual parts (such as the flash,
camera, screen, etc.). When moving the phone, the screen section sometimes does not move with
the rest, as it does not seem to be linked to the rest of the model. This became an issue
when animating the “colour accurate” sequence, as the model would move but the screen would
stay in place (see below).

[The purple part is the screen disconnected from the model]


To fix this I had to keyframe the phone screen’s rotation manually for each frame of the
animation, as seen in the timeline in the screenshot above. Next, I created a new texture for
the phone screen which was a solid purple and then chroma keyed it out in After Effects,
adding an adjustment layer underneath with a combination of Fill, S_TextureFlux,
Hue/Saturation and S_WarpBubble. The ‘purple screen’ layer of the composition was rendered
without any bloom in Eevee because the chroma key was not able to key out the bleeding purple
and left a splotchy mess. I later realised that I could simply link the objects together with
Ctrl+L and parent the phone to an empty object, which would let me control all of the phone’s
movement by moving the empty.
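Sketched in Python, that fix looks roughly like this; the collection name is a placeholder for however the phone's parts are grouped in the scene.

import bpy

scene = bpy.context.scene

# Empty object that will drive the whole phone.
root = bpy.data.objects.new("PhoneRoot", None)
scene.collection.objects.link(root)

# Parent every part (body, screen, flash, cameras) to the empty so they move together.
for part in bpy.data.collections["iPhone"].objects:   # placeholder collection name
    part.parent = root
    part.matrix_parent_inverse = root.matrix_world.inverted()   # keep current placement

# A single set of keyframes on the empty now moves the screen with everything else.
root.keyframe_insert(data_path="location", frame=1)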
'colour accurate' breakdown
For the modelling of the PCB, I was able to create a procedural circuit board texture and
then add a bump node with a Voronoi texture to give the board some depth. After that, I added
randomly placed transistors, diodes, resistors, etc. I textured the circuit board in a
white/black colour scheme to fit the simplified modern aesthetic. I also gave the board a bit
more life with a simple glowing texture by turning up the emission.
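A rough node-level sketch of that material via Blender's Python API is below; the scale, strength and emission values are guesses at something that looks right rather than my exact settings, and the Emission Strength input assumes Blender 2.91 or newer.

import bpy

mat = bpy.data.materials.new("PCB")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Voronoi texture drives a Bump node to give the flat board some surface depth.
voronoi = nodes.new("ShaderNodeTexVoronoi")
voronoi.inputs["Scale"].default_value = 40.0

bump = nodes.new("ShaderNodeBump")
bump.inputs["Strength"].default_value = 0.2

principled = nodes["Principled BSDF"]
links.new(voronoi.outputs["Distance"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], principled.inputs["Normal"])

# Turn the emission up slightly to give the traces a glow.
principled.inputs["Emission Strength"].default_value = 1.5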
As for the PCB modelling process, I also used a free tool that can be downloaded from this
link. JSPlacement is a phenomenal tool that can freely generate sci-fi-looking textures, and
from these I can easily recreate a similar-looking PCB that will tie in well with the
calculated, computational theme I am going for.

Here are some examples that have been created with the JSPlacement tool:
Starting out, I opened a new Blender project file and added a plane to my environment. From
here, I made sure that I was using Cycles (Experimental) for this project, as otherwise
adaptive subdivision would not work. I then simply added a subdivision modifier and adjusted
its values accordingly. As for this last section, it just involves experimenting with nodes
and values, so I will not break it down; instead, here is a screenshot of the final nodes
connected together – as you can see, it is fairly simple to put together for such an
expressive texture.
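As a rough sketch of the settings involved (property names are from Blender 2.9x/3.x and may differ between versions), the experimental feature set and adaptive subdivision can be enabled like this:

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.feature_set = 'EXPERIMENTAL'   # adaptive subdivision is hidden behind this

plane = bpy.context.active_object              # the plane carrying the JSPlacement map
plane.modifiers.new("Subdivision", 'SUBSURF')
plane.cycles.use_adaptive_subdivision = True   # subdivide based on size on screen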
This was the point where I ran into a technical problem and was not able to render out the
project, as it kept crashing on the first frame. I quickly found out that I was only able to
render frames 11-40, which would make this scene useless. To combat this, I rendered out the
animation using viewport rendering, which gives just the textured plane without the bump or
height rendered. Below is a comparison of the viewport render and the full render.
I imported and edited the footage in After Effects, but because I was not able to render out
the full-quality footage, I had to try to mask the simple look of the render. To achieve
this, I used s_hotspots and Deep Glow, placed below the adjustment layer with flicker so that
everything stays consistent. Below is the final look:
PCB demo
The last main section of the advertisement is the ‘Liquid Retina Display’ sequence. This was
one of the original ideas that came to me when I was planning at the start of the EPQ. I
wanted to take direct inspiration from the existing Apple ‘iPhone XR trailer’ by replicating
a shot where the iPhone moves from left to right with the screen facing the camera. This was
one of the hardest shots to recreate, as there was a specific effect I wanted to reproduce,
where a white line of light reflects off the phone screen. This can be achieved by masking an
adjustment layer which has exposure and Deep Glow on it and animating it to move across the
screen. I had a very rough-looking plan that I made as something to base the sequence on,
seen below.

As I wanted a transparent layer behind the iPhone model, I simply turned on the transparent
option under ‘Film’ in Blender. Additionally, I was playing around with the lighting tools
and realised that the shot would turn out better if I changed the world lighting to a solid
white instead of the previous black.
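Those two scene tweaks amount to only a couple of settings; a minimal sketch using Blender's Python API (assuming the default world node setup) is:

import bpy

scene = bpy.context.scene
scene.render.film_transparent = True   # transparent background behind the iPhone model

# Switch the world background from black to a solid white.
world = scene.world
world.use_nodes = True
world.node_tree.nodes["Background"].inputs["Color"].default_value = (1.0, 1.0, 1.0, 1.0)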
Liquid Retina Display blender demo

As I was not sure what to include in the shot leading into this one, I experimented for a
while with different effects and ideas. It came down to this shot, where I used s_textureflux
on the bottom layer with another rendered-out shot of the iPhone on top (for the top iPhone
layer I increased the focal length of the camera so that it would not be cut off at the
sides). The ‘Liquid Retina Display’ text defies the Apple convention of ‘motionless text’
but, in my opinion, it looks better this way. Following this, the iPhone model comes into
frame with the animation above and has s_hotspots, Deep Glow and exposure applied to help
blend it in from the previous black and white shots. I am not entirely happy with the way
this was executed, but I will have to stick with it for now. Additionally, the text has
Optics Compensation applied, which warps it towards the camera.
Additionally, I added a simple animation of a padlock locking, as it is symbolic of privacy,
to indicate a ‘teaser’ of new privacy changes coming with the iPhone 13. This is also the
same symbol used when unlocking an iPhone at the lock screen. Below is a simple breakdown of
the animation, for which I used Photoshop and After Effects.
Padlock Breakdown

With this scene completed, the project is pretty much wrapped up; all that is left is to
render it and upload it to YouTube. As for the rendering process, I rendered out the whole
precomposition in After Effects, upscaled it to 4K and used the default After Effects video
codec.

As the rendered video came out at 14.4GB, I compressed it in VirtualDub using the Xvid codec,
which produced a final size of 88.4MB. Moreover, I changed the frame rate in VirtualDub to
48fps, as YouTube assigns a worse bitrate if a video’s framerate is lower than 48.
Apple Advertisement Final

After this was completed, I showed each participant my own advert and then an official advert
made by Apple. I asked each person which one they thought was made by Apple and then
questioned why they thought so. In a survey of 20 people, 75% of participants could not tell
the difference between my advert and the official Apple advertisement. Interestingly, among
participants aged 30 and over, 90% could not tell the two apart. One of the main reasons
participants were able to identify the real Apple advert was that mine did not have the small
print at the end. Another reason cited was that I did not include any real-life actors, which
is something Apple has started to incorporate in more recent adverts.
