Final Major Project: Presentation

Bonsai VR is a narrative meta-critique of VR

  • A narrator tutors you in taking care of a bonsai tree. 
  • The app appears friendly and calm but gradually starts to corrupt, becoming strange and uncomfortable.
  • Tools don’t function correctly and break. Your tree is replaced and behaves strangely. The narrator tries to persuade you to look the other way and keep the tutorial going as if nothing is wrong.

My core idea is to create a faux-meditative and wholesome bonsai tutorial VR app that gradually becomes more malicious and corrupted, with uncomfortable and insidious themes of absurd horror.

Themes

  • Time commitment vs Low attention span (“here is something I made earlier”)
  • Creative consequences (undo button)
  • The extinction of boredom as a result of distractions: mobile phones and short-form digital media (TikTok, YouTube and Instagram).
  • Capitalism, traditions and values.
  • Mass commodification
  • B̶u̶r̶e̶a̶u̶c̶r̶a̶c̶y̶,̶ ̶a̶l̶t̶h̶o̶u̶g̶h̶ ̶n̶e̶c̶e̶s̶s̶a̶r̶y̶,̶ ̶s̶u̶c̶k̶s̶
  • Physical vs Digital mediums (film vs digital photography) and what is lost when we make art digitally.
  • The short-sighted, senseless and foolish creation of apps made for VR.
  • Rejection of nature.

Audience and installation.

  • Aiming to create a VR experience for LCC’s grad show. 
  • Keep people from overstaying their welcome whilst allowing for user initiative, keeping it relatively short and possibly submittable to XR film festivals.
  • Methods of hand-holding the user through dialogue and interactions with the narrator to keep the experience flowing.
  • Installation elements could be considered to draw intrigue and create content for people outside the VR experience, walking by or watching.

https://docs.google.com/presentation/d/1xMsm67JjBqj-X12401RMTkthaVBSGsmFLCv9zmR8jiQ/edit?usp=sharing

Full Slides presentation

Here We Go Again.

Contextual & Theoretical Studies and Virtual Spaces

So after wrapping up our previous two projects, and after a week’s break for reading and watching plenty of YouTube, we now have two new projects to get on with.

Contextual & Theoretical Studies

For Contextual & Theoretical Studies we have started to look deeper into the state of the art and what is currently being adopted by the industry, looking at papers and articles on sites like the ACM Digital Library and ResearchGate. I will be writing an essay for this unit: “A portfolio of writing that connects practice with theory and research and demonstrates a clear understanding of the key themes addressed in the unit lectures” (1,600 words). Over the last couple of weeks I have had a number of ideas that I would love to write about in my essay, these being:

  1. VR overlays, multi-program communication in VR, and VR minigames/metagames
  2. Will the wider adoption of VR by the public lead to less effective VR in therapy? How much does prior experience with VR affect bias and influence?
  3. Does increased detail in games lower the development of imagination?
  4. Anonymity: how it changes behaviour, and whether it differs in VR
  5. The second person’s place in VR narrative and perspective position

I narrowed my decision down to ideas 1 and 5, as the others seemed to be more objective, research-based questions that would be better explored in technical papers. I am leaning towards writing about the second person, as I have already thought intensively about this after playing “Moss” about a year ago. As much as overlay games like Portable Farm interest me, there aren’t many examples outside of that game, and it’s a rather new concept that I would prefer to revisit later, whereas the second person opens up the wide variety of perspectives in VR.

Virtual Space

Some things I take seriously, but rarely myself or my work. So I might have to work with a few loopholes when creating my serious interactive VR experience for my Virtual Spaces unit.

I started to surround myself with serious topics to try and spark some inspiration, creating the Padlet above to figure out my approach, first watching some Vice documentaries on world affairs and issues. I then realised that although it is imperative that these issues be explored, Billy Beeson, aged 22, is not that person; even after just reading about these issues a smidge, I’m already emotionally exhausted.

So, looking elsewhere, Wendover Productions had a lot of inspiring videos on the world environment and economics. I had a brief time working as a bin man over one summer, an experience that has made me think about waste and where it ends up. However, coming up with a serious and engaging way to dispose of trash, although plausible, seemed unlikely. It could be fun in an arcade genre where you clean up a beach or work a recycling-plant conveyor belt, building up point multipliers to reach a new high score.

Then, after looking into social and personal issues through the lens of HealthyGamer.gg, I tried a different approach, looking at serious games and the kinds of interactions I would like to create, in the hope of using game design and mechanics to tell the story. Dwelling on things like the polarisation of thought and the enforcement of extreme opinion, it would be possible to address these through game mechanics alone, without being heavy-handed in the message: you could teach the player purely through the mechanics, where the goal is to balance between two extremes and avoid either outcome. I like the idea of creating a game with only a loose win state, or a game that lends itself to speedrunning.

Music Video: The Frog, the Tadpole and the Wardrobe #4

Tadpole

I modelled my tadpole in Maya fairly quickly, and after watching a GDC talk on creating the art of ABZÛ I followed part of a fish Shader Graph tutorial to create its movement, using Time and Sine nodes to manipulate the vertices. The main issue I ran into was that the tadpole’s texture UVs weren’t aligned or cut down the right axis, so half the model would go right while the other half would separate and go the opposite direction, because the shader projected onto the model wasn’t uniform. This was easily fixed by going back into Maya, sewing and creating a couple of seams, along with a quick UV unwrap. I then added a gradient that I could control with a vector to mask and ramp the “vertex wiggle” of the head.
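The Time-and-Sine node setup boils down to a small amount of maths. Here is a numeric sketch of the idea (the function name, constants and the linear mask are my own illustrative assumptions, not values from the tutorial):

```python
import math

def wiggle_offset(x, t, amplitude=0.05, frequency=6.0, speed=4.0, mask_start=0.3):
    """Sideways offset for a vertex at normalised body position x (0 = head,
    1 = tail) at time t: a travelling sine wave scaled by a gradient mask."""
    # Linear gradient mask: 0 at the head, ramping up to 1 at the tail,
    # so the head stays still while the tail wiggles.
    mask = max(0.0, (x - mask_start) / (1.0 - mask_start))
    return amplitude * mask * math.sin(frequency * x + speed * t)
```

In Shader Graph the equivalent is a Time node feeding a Sine node, multiplied by the gradient, and added to the vertex position along one axis.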

Now that I had a tadpole, I created an empty game object, “Carousel” (which clearly I didn’t know the spelling for*), with tadpoles positioned like the points of a compass, duplicating and rotating each by 45 degrees. To give the impression that the tadpoles were under the water without actually being under the opaque plane, I set their scale on the Y-axis to 0 to flatten them down to silhouettes. Then, in a similar way to the door, I used a Timeline to record the parent “Carousel” rotating 360 degrees and converted the keyframes into a clip track so it loops indefinitely. I then repeated this for a clockwise rotation.
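The compass layout is just evenly spaced angles around a circle; a quick sketch of the idea (the function name and radius are illustrative):

```python
import math

def carousel_positions(radius, count=8):
    """Flat (x, z) positions for tadpoles spaced like compass points around the
    Carousel's centre: one every 360/count degrees (45 degrees for count=8)."""
    positions = []
    for i in range(count):
        angle = math.radians(i * 360.0 / count)
        positions.append((radius * math.cos(angle), radius * math.sin(angle)))
    return positions

ring = carousel_positions(2.0)  # eight silhouettes, two units from the centre
```

Rotating the parent then moves all eight children together, which is why recording a single 360-degree rotation on “Carousel” is enough.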

Magic Door

I spent quite a lot of time thinking about the magic door in the centre of my app, inspired by the Doctor’s TARDIS, Monsters, Inc. and “The Lion, the Witch and the Wardrobe”. I went looking on the Unity Asset Store and created a new project to experiment in. I first looked at Pocket Portal VR, which works by having the two worlds on top of one another; however, this seemed quite advanced, and although it is relatively user-friendly, I was unsure how I could shift the girl from one side of the portal to the other in the middle of her animations. So I decided to look elsewhere. However, I did use the wood texture in the Pocket Portal package to texture the door I made from simply scaled Unity cube primitives, grouped as a prefab.

I was confused about why the doorknob was flat against the door, but this was because the doorknob was scaled in relation to the parent door, which I had previously scaled to 0 on that axis. This was a lesson in keeping simple, straightforward transforms on parent objects, so as not to complicate things when those traits are passed down. After rebuilding the door on a stronger foundation, I added an empty game object where the hinge should be, setting its transform as the pivot, and parented the door and knob to it. I then used the Timeline to record keyframes of the door in open and closed positions, using the graph editor to smooth the motion towards the end.

Eventually, I ran into this portal stencil shader tutorial, which was exactly what I was looking for; however, it used custom textures and didn’t seem to work as expected with the Universal Render Pipeline and Shader Graph. This led me to an AR stencil shader with Shader Graph tutorial that I found far easier to follow, and which taught me how to convert Shader Graph scripts into custom shader scripts. So I used the clear “Portal” stencil shader that renders materials with Stencil Ref 0, placing a plane with the Portal material in the doorway and replacing all the room and girl materials with custom shader graphs using “Stencil { Ref 0 Comp NotEqual Pass Keep }”. This worked a treat; however, I needed to set the portal’s Cull to Back, as otherwise the back of the portal would render the room even when the room was in front of the portal.

I will need to look into how to switch the girl’s shaders mid-game so that she will render when she steps through the door; otherwise I will have to have her spawn in front of the door, or have her stay inside the room.

Edit: In hindsight, having the door stay open throughout the music video would have put it to better use; having the door sink away after 10 seconds feels like a shame, and it would have looked cleaner, considering I didn’t manage to find a way to change the girl’s shaders as she walked through the door before my presentation. Having the door stay would have created an opportunity for the girl to move about her room, possibly give the frog man a show and tell, or create a sightline between the characters for some interesting back-and-forth mimicry/dance.

Music Video: Aqua #3

I am fond of node-based programming; I have dipped my toes into Blender’s shader nodes previously and have been eager to teach myself Unity Shader Graph because of its potential interactivity, having seen its capabilities used in VRChat.

I started to follow this tutorial on using Shader Graph to create rain ripples, but dropped it after my lecturer provided material showing how to use the Unity particle system with a PNG, fading out a billboard over its lifetime. However, still wanting to use Shader Graph, I figured I could use the particle’s lifetime as a control for the width of the ring, so that as the ring grows in size it thins until it disappears. But to get familiar with Unity Shader Graph I decided to first use it to create an animated toon texture, using this Wind Waker water tutorial as a guide.

I then realised that Shader Graph doesn’t work without the Universal Render Pipeline package, but thankfully unlit shaders are very quick and easy to convert to URP shaders, and can all be selected and changed at once.

After watching a tutorial on custom vertex streams, I figured out I could set a channel in the UV node to carry the lifetime by splitting out the “B” channel of “RGB” (also known as the “Z” channel of “XYZ”). That channel carries a time value going from 0 to 1 as a decimal percentage of the particle’s lifetime. However, this ended up with the ring getting thicker as the decimal counted up, leaving the ring with a thickness of 1 at the end. Using “1 − t” flips the decimal percentage and gives the intended outcome. I couldn’t figure out how to use the shader colour in tandem with the particle colour, so I couldn’t get the shader to fade as its lifetime ended; I know this to be possible, but I moved on before figuring it out.
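The fix is just a remap of the lifetime value; a minimal sketch of it:

```python
def ring_thickness(lifetime_t, max_thickness=1.0):
    """Ring width driven by the lifetime value from the custom vertex stream.
    lifetime_t runs 0 -> 1 over the particle's life; flipping it with (1 - t)
    makes the ring start thick and thin out as it grows, not the other way round."""
    return max_thickness * (1.0 - lifetime_t)
```

In Shader Graph this is a One Minus node on the channel split out of the UV node.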

I wanted there to be a visual trigger for the ripples to create anticipation, as everything plays in time with the music. I first created a sphere that I scaled along the Y-axis and gave a version of the water shader with increased strength and decreased size. I liked the look of this droplet, but due to its detail and busy nature it felt out of place on the one hand, not fitting my flat aesthetic, and on the other, using the same shader made it hard to read next to the water plane. Luckily I found a solution while watching rain tutorials: this 2D rain tutorial. I used its animated sprite billboard, locked on an axis, as my rain particle.

All that was left was to parent the ripple to the drop and set the ripple as a collision emitter. For a while I had it set to the world, so any collider, including the player, would trigger the ripple; but the sprite’s hitbox is slightly larger than the image, so the ripple would spawn about a foot above the water. I used a plane collider instead, slightly below the surface, to align the ripple nicely with the water plane (this should also improve performance and eliminate variables in their behaviour).

I used two techniques to control where the particles would emit. First, I had a single emitter trigger to the beat of the music, and created position keyframes to move the emitter where I wanted it; I had to set the particles to world space so they wouldn’t follow the emitter as it moved around. Second, I created a mesh with seven vertices in the locations where I wanted drops to spawn, and triggered a burst with the Timeline that spawns seven drops at once, one on each vertex. I created a second mesh emitter lower on the water plane as well, to trigger ripples in the same spots, just without the raindrops.

I showed a flatmate my project, and without missing a beat the first thing she said was: “Could you make it prettier?” I didn’t quite understand what she meant. I had set myself limits by using unlit shaders, and I was very fond of the simple paper cut-out look they gave. So I started thinking about adding gradients to the materials, or how I might create a “fake light”: a cone that would apply a hue over materials rendered behind it, or perhaps a stencil like the one I had used for the magic door. When I asked what she meant by suggesting it wasn’t pretty, she used Walkabout Mini Golf as a reference, as we had played it the night before: the awe she felt standing on the course, looking up at the towering cliffs and down into the deepest depths.

I realised that, for the most part, everything in my game felt flat, and from a design stance that defeated one of the key joys of VR. So I simply duplicated the water plane and lowered it to create a sea bed, then created a copy of the material, adding a vector to the alpha channel. The top plane was made translucent and the lower plane greyscale, so as to take on the top plane’s hue. I finished it off by lowering the shader’s scale vector to create the look of light refracting on the sand. This gave the added benefit of being able to simply lower the tadpoles below the water, with a variety of depths for them to occupy.

I now noticed that the water ripple particles were sometimes being drawn below the water, and even more egregiously on top of the lily pads. Changing the lily pad shader to Transparent with 100% opacity fixed its render queue at 3000, and Shader Graph then allowed me to set the ripple and water queues to 2999 and 2998, meaning the lily pad would always draw on top of those materials, no matter their transforms.
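The render queue behaves like a sort key: transparent materials with a higher queue number draw later, so they end up painted on top regardless of depth. A toy model of that ordering, using the queue values above:

```python
def draw_order(materials):
    """Return material names in the order they are drawn: lower render queue
    first, so the material with the highest queue ends up on top."""
    return [name for name, queue in sorted(materials, key=lambda m: m[1])]

order = draw_order([("LilyPad", 3000), ("Water", 2998), ("Ripple", 2999)])
# Water draws first, then Ripple, then LilyPad on top.
```

(Within a single queue, Unity also sorts transparent objects back-to-front, but a different queue number takes precedence, which is what makes the 2998/2999/3000 split reliable.)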

Music Video: Asset Optimising #2

The Girl

Whilst looking for a model to use for my girl character, I found these “MMD Vroid raincoat models” on DeviantArt. MMD models are made for a piece of software called “MikuMikuDance”, for the purpose of making CG fan dance videos. I have never used the software, but MMD models are widely used in the game “VRChat” as player avatars. MMD and PMX files can be converted into FBX files with a Blender plugin, “Give me all your cats”, that comes with many useful features like quick material separation and automatic texture atlasing.

The model was humanoid-ready; I only had to change the toes, which had rigged themselves to the foot, and the foot to the ankle. The model also came with weight-painted bones on the skirt and hair, which I left alone for the time being, as I plan to use the Dynamic Bones plugin I purchased in the new year’s sale to add physics-driven swaying to them. In case I don’t end up doing this, I should remove those bones: although having them sit idle probably isn’t impacting performance greatly, when it comes to optimising for mobile, every little helps.

I modified the model to simplify and optimise it for mobile VR, cutting down from fifteen materials to five. I used only three material slots in Unity but left two spare, to keep the option of using the model’s texture atlas in the event that I decide to change the visual style at a later point. I also decided to modify her clothes to align more with western sensibilities, elongating her coat by scaling it along the Y-axis. I made plans to add a hood similar to the one I sketched out in my Quill maquette, but decided the character would probably read better if you could see her hair from behind.

I played around with different colour ideas, originally believing I would use the song’s album cover as a direct colour reference and have the pink floor ripple to convey it as a liquid; however, this didn’t read as well as I would have liked, so regular blue water was what I ended up going with. I used Simple Sky, a gradient sky asset, to cover up the solid sky that shared its colour with the water plane, giving the appearance of an ongoing plane of water.

  • Double-sided “no cull” hair and clothes.
  • Unlit shaders (with GPU instancing turned on) give me high-performance results that look good without lights present.

Room

I decided I wanted the girl to have a clear location rather than an indiscriminate void. As I have been listening to a lot of lo-fi while making the “game”, I thought it would be cute to find the room used in a lot of the music playlist thumbnails. Luckily, there had been a challenge to recreate just that on Sketchfab, and using their Blender plugin I imported it into Blender to optimise the room for Unity, using Decimate and creating and joining meshes, whilst adding a doorway with a separate material that I could attach a transparent portal shader to, fully enclosing the model.

I created a version of the model with a box coming out of the front to use as a transition point for the girl to switch her shaders; however, I discarded this version, as the box would reveal the wall around the portal as well as the girl.

I used the “Easy Combined” asset in Unity to quickly join all the objects in the room into one mesh, so as to cut down on draw calls, as seen in the Profiler.

I found a Toxic Frog asset on the Unity store; however, this frog’s skeleton isn’t humanoid, so if I wanted it to imitate the girl’s animations I would need to convert the frog. I moved the frog into an upright standing position in Blender’s pose mode and weight-painted the eyes to move with the head. I then tried using CATS’ rig fix; this gave me a humanoid rig with blend shapes for the eyes, mouth and tongue still intact, but unfortunately Mixamo wouldn’t work with this setup.

I ended up removing the rig entirely, keeping the pose enforced by CATS on the mesh, and importing the hollow mesh into Mixamo, where it generated a new, simpler rig that naturally works perfectly with Mixamo’s motion-capture library. It wasn’t quite that simple, though, as the jaw seemed to be independent of head turns, so using Blender again I edited the weight painting on the jaw and head bones and used the sculpting tools to slim down his body. In Unity, I swapped his materials for URP unlit shaders, similar to everything else, and created an animation controller that would blend humanoid animations together.

Music Video: Pre-Production #1

I wanted to pick a song for my music video that I hadn’t heard before, as I’m aware that I’ll have to listen to it over and over again; I was also going to be wary of length. I went through a playlist made by a Twitch streamer I enjoy and listened to each track, specifically listening for whether any of the music conjured strong images in my head that I could build from.

I ended up picking “Frog Summer”, probably due to it containing diegetic sounds and clear sound cues. Also, who doesn’t love frogs!

An idea I played with was integrating Quill into my production pipeline, and I experimented with importing the maquette I had made into Maya. With stronger knowledge and mesh discipline in Quill, I could see it being possible to at least use models made in Quill as a reference or trace for modelling in Maya, or in a best-case scenario to use them directly. However, the topology generated by Quill’s brush strokes is in most cases seemingly unusable. I would love to revisit Quill, or perhaps another tool, but until modelling features like booleans are implemented, it will remain a quick and effective way of developing a visual look and feel.

I’m glad I tried to use Quill, as it helped mobilise my brain into thinking in the third dimension, whereas creating storyboards was naturally limiting. I also found that storyboarding with the 360 video template was even more unhelpful, as everything is laid out in relation to the user, and unlike 360 video, true 6DOF VR allows the user to move freely around the scene. My hypothesis is that planning out scenes for VR would work far better using techniques shaped by stage performance, as theatre requires planning for multiple viewing angles, unlike film’s single fixed viewing angle.

  • In our recorded content and lectures we looked at scripts that let us change the Unity scene, as well as simple game interactions and locomotion with teleportation.

My idea is lacking any kind of VR user interaction so far. It would be good to set up bounds for the player to move around the scene, perhaps using the VR teleportation locomotion we looked at in a lesson. To make use of VR, I thought it would be a cool idea to have the experience start when you throw frogspawn into the water, triggering the music, and then have dragging the egg smaller or larger change the scale of the player in turn, allowing a viewer to watch multiple times from different sizes.

AR Detective game

(Ongoing! tbf)

Our 1st task as part of the Introduction to Virtual Reality unit is to make an AR detective game. I created a Padlet to plan out a linear game where, after finding a dead body, you have to follow clues like footprints and blood splats to find a terrifying supernatural murderer.

Step 2: feeling out Unity and AR tools

We have been using

Step 3: understanding the technology and going back to the drawing board

Step 4: creating a prototype

Step 5: adding a little flair with Maya, and a win state

Global Campus Studio

Taking part in the Global Campus has been a great way to set my expectations and grow my understanding of the kind of work I am capable of producing at such an early point in my studies.

I had the pleasure of working with students from other LCC courses, as well as students from Ryerson University, Canada, as part of the “Global Campus Studio”. The students from Canada had started planning the project weeks prior to us UK students joining.

The group’s idea revolved around having live performances in a virtual environment as an alternative to attending music festivals and gigs, presented as a social experience with an emphasis on sharing the digital space: converging musical tastes creating a social meeting place for music lovers and creatives.

As the only student in the group studying VR, I found that I could help the others develop their understanding of the medium and the nature of the industry. We discussed the difficulties of presenting VR faithfully through other mediums, like the ones used in modern advertising, identifying that there is a divide in the market between mobile and desktop VR, and helped devise ways to entice both audiences. (Ongoing! tbf)

Collaboration with Stefania Fiorucci

In creating the 360 video used to give a

https://vrmusicc.wixsite.com/website

https://www.ryerson.ca/fcad/international/global-campus-studio/gcs-creativity/

Virtual Principles Week 1 and Oscilloscopes

So it begins! I was surprised at how quickly the lectures got me engaged with the subject. We went over some moving-picture history, going all the way back to the work of Eadweard Muybridge and Georges Méliès, some staples from my film and animation classes. However, I found the work of Mary Ellen Bute particularly interesting: she used an oscilloscope (a scientific instrument that displays voltage over time), driven by audio, to create abstract images in the early-to-mid 1900s.

Her work reminds me of the more modern work of Jerobeam Fenderson & “HANSI3D”, two Austrian artists who take the idea and run with it, using the music itself to create the images, without the aid of film’s ability to be cut, stitched and used as assets to a greater piece. It’s just pure oscilloscope noise with no orchestral accompaniment. https://oscilloscopemusic.com/

Watch this Smarter Every Day video for more info.

In the frame of VR, I wonder if you could get a similar result with an XYZ axis, adding an extra dimension to the usual X and Y; would you be able to extract a third channel out of a stereo signal? You might get some 3D results with both conventional mono and two-channel stereo. Food for thought; I might revisit the oscilloscope someday.
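For reference, an XY-mode oscilloscope maps the two stereo channels directly to beam position, which is why two channels give 2D figures and a third axis would need a third signal. A small sketch of that mapping (the names and sample count are my own):

```python
import math

def xy_trace(samples_left, samples_right):
    """XY-mode oscilloscope: the left channel drives the beam's X position and
    the right channel drives Y, so a stereo signal draws a 2D figure directly."""
    return list(zip(samples_left, samples_right))

# Two sine waves a quarter-cycle apart trace a circle (a simple Lissajous figure).
n = 64
left = [math.sin(2 * math.pi * i / n) for i in range(n)]
right = [math.cos(2 * math.pi * i / n) for i in range(n)]
points = xy_trace(left, right)
```

Conventional stereo really does carry only two channels, so a true Z axis would need a third signal (e.g. a multichannel format), though a clever 2D projection could fake depth.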

P.S. My 1st month living in London has been full of mixed emotions. The course is picking up pace after a slow start, with most people taking a rather coy approach, opting to type and go without a webcam in the web classes. I expected that, but thought it might be different due to the class’s small size of 7 students.

After meeting in person on Friday, people are becoming a little more comfortable with each other, and I have nothing but respect for our head of year, Anne, having to lead the 1st year of this course through this pain-in-the-arse pandemic.

I can’t wait until we can take our masks off in class, because with a clogged right ear I can’t hear too well and it’s impossible to lip-read. Not to mention the endlessly fogged glasses.