Upon volunteering with the UOW Makerspace team at the start of the session, a few things quickly became apparent.
- The Makerspace is very popular among STEM students.
- The Makerspace is far less popular among non-STEM students.
- Many people entering the Makerspace have no idea how to use the tech.
As such, I initially conceptualised filming half a dozen videos that would hypothetically replace a large part of the inductions that students are currently required to undertake for many of the tools inside the facility. Moreover, I decided I could film using equipment from the Makerspace itself; the Samsung Gear 360 camera being a nice showcase of the technology available within. I stumbled upon research published by Google which suggested that 360 video motivates viewers to ‘get in on the action’ by moving about and controlling their perspective, and that it ‘also makes them want to share’ the experience, which sounded like a great opportunity for the Makerspace to gain some word-of-mouth. This choice also segued nicely into featuring another technology that the Makerspace has in spades: VR headsets. Virtual reality headsets greatly increase the value of 360 video with their ability to respond immediately to head movements, heightening the sense of immersion far more than simply watching on a flat screen via Facebook or YouTube.
Another benefit I envisioned was that, if filmed from a first-person perspective, the format might work exceptionally well for teaching the viewer exactly how to operate the technology. Eventually, my concept changed to using 360 videos as an introductory device for equipment that requires an induction to use. As approximately one-minute videos, they would instead show students the basics of the technology in question, and then prompt the viewer to sign up for an induction if the video interested them. I will touch more on why this changed later under ‘trajectory’.
The first step in my methodology was embracing the F.I.S.T. approach by simply grabbing the 360 camera and seeing what worked and what didn’t. For example, I quickly filmed some footage to understand what my workflow was going to be like. After doing a grandiose sweep of the space with the camera, I installed the software bundled with the camera on a computer of choice, and fumbled with it until I understood how to proceed. One indispensable point of reference for this entire project, particularly in the formative, experimental stages, was this video tutorial by ‘Geoff’ at geoffmobile.com. That resource broke down the vital technical minutiae, including the ‘stitching’ process with the proprietary software, as well as which settings to fiddle with in Premiere Pro afterwards, such as the ‘offset’ function and final export settings.
After filming some footage, I would ‘stitch’ it either by using the PC software or by sending it over Bluetooth from the Gear 360 itself to the Makerspace’s own Samsung Galaxy S7. The latter was a suitable alternative if the footage was less than approximately 50 megabytes; any larger, and it was quicker to go through the PC software. The next step was viewing the test footage on the Gear VR. Once copied to the phone, it was possible to see how the end product would look in virtual reality, and therefore whether a given shooting technique was viable. Then, it was often a matter of taking advantage of opportunities where someone was using the technology I wanted to film and asking if I could film their work. Other times, such as during the Open Day on 19th August, I aimed to take advantage of the increased foot traffic and capture as much footage as possible for the video on VR, with mixed results.
After checking the footage was usable on the PC with the stitching software, I had to change workstations to use Premiere Pro on the Makerspace Macs (I hadn’t put much consideration into which computer I installed the single-licence program on). I previously had no experience with Premiere Pro, so basic YouTube tutorials such as this one helped me understand how to do simple things like add fade-ins, add text titles to the start of videos, and manage different audio levels. As I was cobbling together my footage in Premiere, I noticed that some clips had considerable ‘shaking’ that wasn’t immediately obvious. This user guide helped me stabilise clips such as the time-lapse in the carving machine, which would otherwise have been unusable. I would then start thinking about what my voice-over should say over the video, and consult the most knowledgeable Makerspace official on that technology, be it Nathan the coordinator, or casual staff members Tom and Michael. It would be remiss of me not to sincerely thank, and credit, these three individuals for the completion of these videos. They contributed not only to the spoken component of the videos, but were also there to suggest solutions and alternatives when something didn’t work out. As such, they were another crucial source in assembling this project. After the commentary was recorded, using a quality ‘Rode Podcaster’ microphone from the Makerspace, naturally, I sourced background music for the videos from ‘freemusicarchive.org’. This website provided me with almost too much choice; it was often a time-consuming process merely to wade through the thousands of applicable tracks.
The final videos would represent an exercise in compromise, as I fought to bend both the edited footage and the spoken commentary into a cohesive whole. This was the result of my indecision on whether to write the ‘script’ first, or to film and edit a smooth-flowing video first, something I will need to choose a direction on going forward. Because I had edited the video in Premiere Pro as ‘equirectangular’ footage, effectively a ‘flat’ image like a typical map of the Earth, it would need to be converted back into a sphere before viewing.
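The relationship between the flat frame and the sphere can be sketched with a little trigonometry: every viewing direction corresponds to exactly one pixel of the equirectangular image, which is why the ‘flat’ file contains everything a player needs to rebuild the full 360 view. The function below is an illustrative sketch (the resolution and names are my own, not taken from the actual footage or software).

```python
import math

def equirect_pixel(x, y, z, width=3840, height=1920):
    """Map a 3D viewing direction to (u, v) pixel coordinates
    in an equirectangular frame of the given size."""
    norm = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)              # yaw: -pi..pi, 0 = straight ahead
    lat = math.asin(y / norm)           # pitch: -pi/2..pi/2, pi/2 = straight up
    u = (lon + math.pi) / (2 * math.pi) * width   # left edge = looking behind
    v = (math.pi / 2 - lat) / math.pi * height    # top edge = straight up
    return u, v

# Looking straight ahead lands in the centre of the frame:
print(equirect_pixel(0, 0, 1))   # (1920.0, 960.0)
# Looking straight up lands on the top edge:
print(equirect_pixel(0, 1, 0))   # (1920.0, 0.0)
```

A VR player simply runs this mapping in reverse for every pixel on the headset’s screen, which is why the same file can be ‘wrapped’ in real time.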
While some video players, such as ‘Fulldive VR’ on the Play Store, can wrap footage in real time, YouTube, Facebook and other mobile VR platforms need the final exported video to carry metadata identifying it as 360. This necessitated using a tool called the Spatial Media Metadata Injector. This script tags the exported ‘flat’ video file so that players wrap it around a digital sphere, producing an ‘injected’ version and effectively bringing the process to a conclusion. I then uploaded the completed video as ‘unlisted’ to YouTube, and sent a link to the media team, who could watch it and make sure there was nothing that needed changing. Assuming there were no problems, I’d ask the team to distribute the videos on the social media accounts at regular intervals to space them out.
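For anyone repeating this step from the command line rather than the injector’s GUI, the same repository also ships a Python CLI; the invocation below reflects my reading of its README, and the filenames are placeholders.

```shell
# Clone the tool, then inject (-i) the 360 metadata into a finished export.
git clone https://github.com/google/spatial-media.git
cd spatial-media
python spatialmedia -i my_export.mp4 my_export_injected.mp4
```

The injected copy is the one to upload; the original export stays untouched.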
With the Makerspace, and the Makerspace Club by extension, being new features on the UOW landscape, any content that can be disseminated and associated with them would be of value to that community. However, as I touched on earlier, I believe the project has three main utilities.
- Giving the Makerspace visibility beyond STEM students (much of the tech is perfect for media and creative arts applications).
- Introducing students to technologies that they aren’t familiar with; encouraging further engagement through inductions.
- Providing a showpiece for myself that demonstrates my ability to work with cutting-edge technologies, people, video and audio.
A secondary benefit of the project has been my own initiation with, and discovery of interest in, machinery that I otherwise wouldn’t have given the time of day. Having conquered the VR and 3D printing arenas, I turned my attention to the automatic embroidery machine for another video. It didn’t take me long to fully appreciate the utility of the device, even if only for my own pet projects.
The initial concept of filming the Makerspace inductions themselves had to change for a few reasons. Primarily, formal inductions require signed paperwork from the participant indicating that they acknowledge any risks involved and have been supervised by an assessor. With a video that one watches alone, there’s no way to verify that the viewer actually understands the information. The second reason was that filming in first person at length, while also maintaining a quality, informative presentation, would simply be too difficult. I did look into the possibility, but this blog post detailing how to make a DIY hat mount for your Gear 360 that’s ‘not entirely nauseating’ didn’t leave me convinced. The reality of enormous file sizes would pose yet another issue. The final, and perhaps most practical, reason for the change to short, introductory videos was that nobody wants a relatively low-resolution screen plastered to their face for 45 minutes.
Before settling on the Makerspace’s own Gear 360 camera, this source motivated me to first consider all the options available to me. I spoke with AV Services in building 20, AV Services in building 23, IMTS in the library and even the T&T Hub to find the ideal equipment before starting the project. The suggestion by Makerspace coordinator Nathan that I could make use of the other 360 camera available to him sealed the deal, although in the end I had to make do with the single Gear 360 camera.
The Makerspace is closed on weekends and after 6pm, but I still needed to work on my videos regardless. With the single-licence software that came with the Gear 360 installed on a Makerspace PC, and the accompanying Galaxy S7 that would otherwise be used to stitch footage tucked away, I often had to wait until I could return to the Makerspace. That was until I came across this Reddit thread, which detailed how to obtain a 30-day trial with some finagling. This allowed me to complete the time-consuming stitching process at home, in the background. The next obstacle was Premiere Pro, the trial of which only lasted a week and ran awfully on my otherwise respectable PC; the preview video I made for my beta presentation inexplicably took hours to assemble and render.
This meant that the final stretch of editing occurred in a flurry of different circumstances: initially on the Makerspace Macs, then on library Macs on the weekend with my own laptop for recording commentary in Audacity in quiet locations, and finally on my girlfriend’s laptop to continue editing in Premiere during the library’s closing periods. The Gear VR being inaccessible also prompted us to swap phones during this period, so I could test my exported videos on her Google Pixel + Daydream VR configuration instead.
The push to get all five videos done for the machinery that requires inductions has meant that I have had to put a pin in the video on the virtual reality headsets for the moment. With all the footage ready to go, I look forward to completing it as a fun addition to the main five videos in the next couple of days, incorporating 360 footage of VR games.
This has been one of the most rewarding, motivating and challenging projects I have undertaken in years. Each video posed a different problem that I relished solving.
- How do you effectively make use of 360 filming to showcase examples of what you can make with a technology?
- How do you record virtual reality footage and present it as 360 video?
- How do you capture an hour-long manufacturing process inside a one-minute video?
- How do you affix a 360 camera to the inside of a laser cutter or carving machine?
These were all stimulating challenges that I genuinely loved encountering and overcoming. Beyond mere enjoyment, I also managed to complete a video for every Makerspace technology that requires an induction to use, which was my goal.
Despite the great personal satisfaction gained, and having accomplished what I set out to do, it certainly wasn’t easy or a smooth process.
- 360 + VR required much more time to edit than traditional video; had to ensure it still looked right in VR after editing.
- The reality of the project meant a lot of back-and-forth, not only between editing stages but between uni and home to record, stitch and edit. This constantly presented issues of compatibility and differing licence affordances (fonts, plug-ins).
- The Samsung Gear 360 (2016 model) would frequently overheat and shut off at critical times, e.g. just as the embroidery machine started to sew; a well-known problem with that model.
- The final videos look crisp and high-resolution in 4K on a flat screen, but when viewed in VR (as intended) they look a bit blurry. Perhaps the immersion factor makes up for this shortcoming; I’ll let the audience decide.
I’d like to finish by giving special thanks to my friends Elysse Turner, for encouraging me, lending her equipment and professional expertise, and Cường Lâm, who assisted me in filming and could always be counted on to provide excellent feedback and suggestions.
As hyperlinked in the body:
- Is 360 Video Worth It?, https://www.thinkwithgoogle.com/advertising-channels/video/360-video-advertising/
- How to Edit Samsung Gear 360 video in Adobe Premiere, https://www.youtube.com/watch?v=wO7FNAnqkH0
- Adobe Premiere Pro for Absolute Beginners, https://www.youtube.com/watch?v=aMeHRRWNGgA
- Stabilize motion with the Warp Stabilizer effect, https://helpx.adobe.com/au/premiere-pro/using/stabilize-motion-warp-stabilizer-effect.html
- Free Music Archive, http://freemusicarchive.org/
- Fulldive VR, https://play.google.com/store/apps/details?id=in.fulldive.shell&hl=en
- Spatial Media Metadata Injector, https://github.com/google/spatial-media/releases
- Samsung Gear 360 Camera – DIY Hat Mount, http://www.rhizomelabs.com/single-post/2016/06/22/Samsung-Gear-360-Camera-DIY-Hat-Mount
- 4K 360 camera comparison, http://360rumors.com/2016/09/4k-360-camera-comparison-nikon.html
- Action Director missing serial work around, https://www.reddit.com/r/Gear360/comments/5mkvy0/action_director_missing_serial_work_around/
- 360 Capture SDK, https://github.com/facebook/360-Capture-SDK
- Potential Gear 360 Major Flaw: Overheating, https://www.reddit.com/r/Gear360/comments/4k6ixd/potential_gear_360_major_flaw_overheating/