3D Modelling for the Immersive Vision Theatre

We were tasked with creating a 10-second animation in Blender for the Immersive Vision Theatre (IVT). With roughly two weeks to make it, we had to keep it simple while still demonstrating our knowledge of Blender and 3D modelling.

We each decided to create a different game. Meg (view her website here!) created Space Chess, Jack made Space Invaders, Rachel made Pac-Man, Harry made Tetris, James made Monopoly, and I made a first-person Sonic, as I felt this would work well in the IVT.

I decided to go for a low-poly aesthetic: with limited time, I needed to keep things simple whilst still making them look good. Having previous experience with Blender, I was able to quickly create assets for my animation such as springs, rocks and spikes.


Work in progress

To render for the dome, we had to use Blender’s fisheye camera. We also had to take into account the angle of the screen, which meant angling the camera roughly 25° down.


One of the frames from my animation, rendered with the fisheye camera.
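For anyone curious, the camera setup can be done from Blender’s Python console. This is only a minimal sketch, assuming Cycles in a 2.7x-era Blender and a camera object named “Camera” — the exact values depend on your scene:

```python
import math
import bpy

scene = bpy.context.scene
cam_obj = bpy.data.objects["Camera"]   # assumes the default camera name
cam = cam_obj.data

# Cycles panoramic fisheye camera for fulldome rendering.
scene.render.engine = 'CYCLES'
cam.type = 'PANO'
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = math.radians(180)

# Tilt the camera roughly 25 degrees below the horizon to match
# the angle of the IVT's screen (90 degrees would be level).
cam_obj.rotation_euler = (math.radians(90 - 25), 0.0, 0.0)

# Square output so the fisheye circle isn't stretched.
scene.render.resolution_x = 2048
scene.render.resolution_y = 2048
```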

Because the dome is truncated, we had to edit the footage in After Effects, cropping 20% off the top so it would fit. We also took this as an opportunity to add transitions and audio so our work flowed together better.


My part of the animation, ready to be combined with the rest in After Effects for our presentation.


Combining Art & Technology: The Painting Robot

Our latest project, dubbed ‘Pip3tt3’, involved creating a robot that takes virtual data and represents it in the physical world, using a Raspberry Pi and/or Arduino.

In this instance, three of us came together to create three different methods, with a different colour for each: blue for tweets to #pipblue on Twitter (Meg’s part), green for social media notification noises (Jack’s part), and purple for live webpage hits (my part).


My part of the ‘robot’ used live hits on my server’s web page to tell the robot to paint. To do this, I used a Raspberry Pi with Apache and PHP installed, plus a reverse SSH tunnel from my server that was activated when a script was run on my Pi.
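I haven’t included the actual scripts here, but the idea is simply “every new request to the page triggers a paint event”. As a rough sketch of that idea in Python (rather than the PHP setup I actually used), watching the Apache access log on the Pi could look something like this — the log path and the print-out stand in for whatever you do with each hit:

```python
import time

# Default location on a Raspbian/Apache install; adjust to your setup.
ACCESS_LOG = "/var/log/apache2/access.log"

def follow(path):
    """Yield new lines appended to a file, like `tail -f`."""
    with open(path) as log:
        log.seek(0, 2)  # start at the end of the file
        while True:
            line = log.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line

if __name__ == "__main__":
    for hit in follow(ACCESS_LOG):
        # Each new request line means someone visited the page,
        # so hand the event on to the painting queue (see below).
        print("New hit:", hit.strip())
```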


The first prototype using a relay switch and small solenoid – which turned out to be too small!

To get the messages from point A to point B I used Beanstalk, a work queue that can hold multiple ‘jobs’, so it could deliver messages to several different Arduinos. In this case, however, it only needed to tell one to switch on!


Using Beanstalk to send jobs via the Raspberry Pi.
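The exact scripts aren’t shown here, but the put/reserve pattern looks roughly like this using beanstalkc, one common Python client for beanstalkd (the ‘paint’ tube name is just for illustration):

```python
import beanstalkc  # a common Python client for beanstalkd

# Producer side: runs on the Pi whenever a web hit comes in.
queue = beanstalkc.Connection(host='localhost', port=11300)
queue.use('paint')       # hypothetical tube name, one per colour/Arduino
queue.put('on')          # the only message the Arduino side needs

# Consumer side: the script that talks to the Arduino.
worker = beanstalkc.Connection(host='localhost', port=11300)
worker.watch('paint')
job = worker.reserve()   # blocks until a job arrives
if job.body == 'on':
    print("Tell the Arduino to fire the solenoid")
job.delete()             # remove the finished job from the queue
```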

The Arduino part, however, uses the same code as the other parts of the robot. It takes an input, in this case ‘on’, which tells it to use the attached solenoid to release some paint.
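The Arduino sketch itself isn’t shown here, and the post doesn’t spell out how the job reaches the board; assuming the Arduino is connected to the Pi over USB serial, handing the ‘on’ command over could look something like this (using pyserial; the port name is an assumption):

```python
import serial  # pyserial

# On a Pi an Arduino usually appears as /dev/ttyACM0 or /dev/ttyUSB0.
arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

# Send the 'on' command; the Arduino reads it and pulses the
# relay/solenoid to release a drop of paint.
arduino.write(b'on\n')
arduino.close()
```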


The wiring of the robot is actually quite simple, as seen above: it uses one 9 V battery, a relay switch, and a 12 V, 1 kg-force solenoid (smaller ones were too weak to squeeze the pipette!). It is housed neatly inside a wooden box with a 3D-printed holder for the pipettes.



The Pi Camera, which we used to record the painting process.


First test painting created by the robot.


Final painting created by the robot during the presentation.