Netscapes: Insight – IVT Testing

Today we did our final build and first live test in the Immersive Vision Theatre (IVT). We started by fitting the Raspberry Pi and touchscreen inside the plinth, then transported the equipment to the dome ready for our presentation.


Fitting the Pi 3 + touchscreen

Chris added wooden beams to support the weight of the Pi, as it will be under a lot of pressure when the touchscreen is in use. This should prevent the touchscreen from pulling away from the plinth.


Setting up in the IVT – Modifying Code

Whilst in the IVT, Gintaré updated her code to better suit the shape of the screen. She moved some of the key elements of the visuals so they were more centred within the dome, bringing them to the viewer’s attention.

 


Setting up the visualization

We transported the physical part of our project to the IVT and decided where to set it up. We then tested the project within the space to understand how it would look and feel to viewers and how the colours would display in the dome.


Glass head with touchscreen interface

We took this as an opportunity to double-check our database connections were working. During this time we ran into issues with page refreshing (which I quickly resolved) and with the internet connection, which we worked around by using a mobile access point.


Glass head interface in front of the projection.

We even invited Luke to test out our user interface, and have a go at inputting his own data into the visualization!


Luke testing out the user interface!


Head test with visualization within the dome.


Netscapes: Development Process

Creating a project is an organic process that contains many twists and turns. Below I will outline some of the changes we had to make during the development of our project.

Before Christmas break, Chris built a wooden plinth to mount the glass head on & house all the electronics. He also designed & 3D printed an inner diffuser for our lighting. This will be displayed in the IVT as the interactive front end of our project.


Glass head mounted on plinth (without diffuser). The gap at the front will house the Raspberry Pi & GPIO Touchscreen.

Modified Slider/LED control for Arduino

To further improve the LED lighting part of our piece, we decided to replace the serial connection with a Bluetooth connection. Chris purchased a Bluetooth module and began programming it to take inputs from a mobile device.

Chris and I worked together to program the RGB LED code to work over Bluetooth. We tested the connection using a Bluetooth terminal app on our Android devices, sending simple “a” and “b” messages to turn an LED on and off remotely. We discovered that this only works with one device at a time, so we will need to account for this when the model is on display.
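For reference, the test code only needs a few lines. This is a minimal sketch of the idea, assuming an HC-05-style module wired to pins 10/11 and an LED on pin 13 (the exact module and wiring are illustrative, not necessarily our final setup):

```cpp
// Minimal test sketch (illustrative): toggle an LED over Bluetooth serial.
// Assumes an HC-05-style module on pins 10/11 -- the actual wiring may differ.
#include <SoftwareSerial.h>

SoftwareSerial bluetooth(10, 11); // RX, TX
const int LED_PIN = 13;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  bluetooth.begin(9600); // common default baud rate for these modules
}

void loop() {
  if (bluetooth.available()) {
    char c = bluetooth.read();
    if (c == 'a') digitalWrite(LED_PIN, HIGH);      // "a" = LED on
    else if (c == 'b') digitalWrite(LED_PIN, LOW);  // "b" = LED off
  }
}
```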

We decided to make a mobile app to control the colour of the LEDs, which I will build in Processing over the next few days.

Resolving LED brightness issues 

We found that with the 3D printed inner diffuser in place, the RGB LEDs were not bright enough to light up the glass head.


Original setup, with multiple RGB LEDs, Arduino & Bluetooth module.


NeoPixel 24-LED ring

We tried an LED ring (which I had been using in another project), since it is considerably brighter than the individual LEDs. This worked much better; the colour was visible even in the brightly lit room, and the ring’s diameter was a perfect fit for the base of the diffuser!


Glass head with diffuser and LED ring.

We purchased another LED ring and cut new holes in the mount to accommodate the wiring.
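The brightness test itself is short with Adafruit’s NeoPixel library; a sketch along these lines is enough (the data pin and test colour here are assumptions):

```cpp
// Brightness test (illustrative): light the whole 24-LED ring in one colour.
// Data pin 6 and the test colour are assumptions, not our final values.
#include <Adafruit_NeoPixel.h>

const int RING_PIN = 6;
const int NUM_PIXELS = 24;
Adafruit_NeoPixel ring(NUM_PIXELS, RING_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  ring.begin();
  ring.setBrightness(255); // full brightness, to test against room lighting
  for (int i = 0; i < NUM_PIXELS; i++) {
    ring.setPixelColor(i, ring.Color(0, 80, 255)); // blue-ish test colour
  }
  ring.show();
}

void loop() {}
```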

Switching and setting up databases

Due to issues connecting to our MongoDB database, we decided to switch from MongoDB to MySQL.

I set up a new database on my server with access to the necessary tables. I sent Gintaré a link to instructions on how to set it up, along with the necessary details, so she can get to work building the data visualization.

Next Steps
Our next steps are to:

  • Wire up the LED ring and program it to respond to Bluetooth messages (modifying the earlier code; see the sketch after this list)
  • Develop an Android app
  • Connect the visualization and the slider inputs to my server/database.
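As a starting point for the first item, the Bluetooth code and the ring code can be combined roughly as follows. The “r,g,b” message format is an assumption we will pin down once the app exists, and the pins are the same hypothetical wiring as in the earlier sketch:

```cpp
// Sketch of the combined code (illustrative): set the ring's colour from
// Bluetooth messages. We assume the app sends lines like "255,40,0\n";
// the real message format will be decided when the app is written.
#include <SoftwareSerial.h>
#include <Adafruit_NeoPixel.h>

SoftwareSerial bluetooth(10, 11); // RX, TX (same assumed wiring as before)
Adafruit_NeoPixel ring(24, 6, NEO_GRB + NEO_KHZ800);

void setup() {
  bluetooth.begin(9600);
  ring.begin();
  ring.show(); // start with all pixels off
}

void loop() {
  if (bluetooth.available()) {
    int r = bluetooth.parseInt();
    int g = bluetooth.parseInt();
    int b = bluetooth.parseInt();
    if (bluetooth.read() == '\n') { // only update on a complete message
      for (int i = 0; i < ring.numPixels(); i++) {
        ring.setPixelColor(i, ring.Color(r, g, b));
      }
      ring.show();
    }
  }
}
```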

 

Netscapes – Project Planning – Technologies

After developing our idea further, here are the technologies we are planning to use to create our project:

MySQL/MongoDB

For storage of the data collected from our app. SQL may be more appropriate for its structured approach, but does not work easily with Phonegap. MongoDB has plugins for Phonegap and may be considered a newer technology, and thus more appropriate for our brief. (MongoDB, 2017)

Phonegap

We will use Phonegap to build a basic app, containing input fields, that users can download. Phonegap has a handy online compiler, meaning apps that work across multiple mobile OSs can be built quickly. (PhoneGap, 2017)

Raspberry Pi

A Raspberry Pi will be used for input of data without the use of a mobile phone. This will be achieved by pairing it with a touchscreen HAT. We will use this for the presentation within the IVT.

p5.js

p5.js is a JavaScript library with drawing functionality. We will use p5.js to create interactive web-based visualizations with server connections. (P5js.org, 2017)

Immersive Vision Theatre (IVT)

We will use the Immersive Vision Theatre for the large-scale presentation of our project. Users will be able to come in and view their data visualized in real time.


Sources:

MongoDB. (2017). MongoDB for GIANT Ideas. [online] Available at: https://www.mongodb.com/ [Accessed 25 Nov. 2017].

PhoneGap. (2017). PhoneGap. [online] Phonegap.com. Available at: https://phonegap.com/ [Accessed 25 Nov. 2017].

P5js.org. (2017). p5.js | home. [online] Available at: https://p5js.org/ [Accessed 25 Nov. 2017].

3D Modelling for the Immersive Vision Theatre

We were tasked with creating a 10-second animation in Blender for the Immersive Vision Theatre. We had roughly two weeks to create it, so we had to keep it simple yet still demonstrate our knowledge of Blender and 3D modelling.

We each decided to create a different game. Meg (view her website here!) created Space Chess, Jack made Space Invaders, Rachel made Pac-Man, Harry made Tetris, James made Monopoly, and I made a first-person Sonic, as I felt this would work well in the IVT.

I decided to go for a low-poly aesthetic, as we had limited time and I needed to keep it simple whilst still making it look good. Having previous experience with Blender, I was able to easily create assets for my animation, such as springs, rocks and spikes.


Work in progress

To render for the dome, we had to use Blender’s fisheye camera. We also had to take into account the angle of the screen, which meant angling the camera roughly 25° down.


One of the frames from my animation, rendered with the fisheye camera.

Because the dome is truncated, we had to edit the footage in After Effects, cropping 20% off the top so it would fit. We also took this as an opportunity to add transitions and audio to make our work flow together better.


My part of the animation, ready to be put together with the rest in After Effects for our presentation.