Netscapes: Insight – Final Presentation

Insight: The Big Five is an interactive artwork inspired by psychology and personality. Users are invited to take a short personality test on the built-in touchscreen and see their results displayed in real time within the Immersive Vision Theatre (IVT). The glass head also reacts to the input data with a variety of lighting effects.

IMG_20180123_161751.jpg

User inputs

IMG_20180123_162816.jpg

User input with changing visuals

insightfinalpres.gif

User inputs

 

 


Netscapes: Insight – IVT Testing

Today we did our final build and first live test in the Immersive Vision Theatre (IVT). We started by fitting the Raspberry Pi and touchscreen inside the plinth, then transporting the equipment to the dome ready for our presentation.

IMG_20180122_160933

Fitting Pi3 + Touchscreen

Chris added wooden beams to support the weight of the Pi, as it will be under a lot of pressure when the touchscreen is in use. This should prevent the touchscreen from moving away from the plinth.

IMG_20180122_150137.jpg

Setting up in the IVT – Modifying Code

Whilst in the IVT, Gintare updated her code to work better within the shape of the screen. She moved some of the key elements of the visuals so they were more centered within the dome, bringing them to the viewer’s attention.

 

vizlaptop.png

Setting up the visualization

We transported the physical part of our project to the IVT and decided where to set it up. We then tested the project within the space to understand how it would look and feel to viewers, and how the colours would display in the dome.

head interface.png

Glass head with touchscreen interface

We took this as an opportunity to double-check our database connections were working. During this time we ran into issues with page refreshing (which I quickly resolved) and with internet connection, which we resolved by using a mobile access point.

headdemo.png

Glass head interface in front of the projection.

We even invited Luke to test out our user interface, and have a go at inputting his own data into the visualization!

head interaction.png

Luke testing out the user interface!

dometest

Head test with visualization within the dome.

Netscapes: Building Bluetooth Connections – Part 2

Today we had access to the physical side of the project, so I tested my Bluetooth code (see my previous post) against the Arduino side. Luckily, after pairing with the HC-05 Bluetooth component, the code worked the first time, with no need for debugging!

IMG_20180122_160927

The Arduino side, with HC-05 Bluetooth component & Neopixel ring

Chris and I modified the Arduino code to output different lighting effects based on the character sent across Bluetooth. We decided on a default of red with a breathing effect (which I created for a previous project), plus a rainbow spin effect.

LEDbluetooth

Bluetooth message sent on tapping “Generate”

How it works

  • When the local server is started, it searches through paired devices to find the HC-05 module.
  • When it is found, it opens a connection and sends it the instruction to turn on.
  • When the generate button is pressed, a new message is sent across the connection instructing it to run the rainbow effect.
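The character protocol itself is simple, and the steps above can be sketched as a small dispatcher. The only character confirmed in our code is 'a' (turn on), so the 'g' byte for the rainbow effect is an illustrative assumption:

```javascript
// Sketch of the single-character protocol sent over the serial link.
// 'a' is the activation character from our code; 'g' is a hypothetical
// byte for the "Generate" message -- the real one may differ.
function effectForChar(c) {
  switch (c) {
    case 'a': return 'red-breathing'; // default state once switched on
    case 'g': return 'rainbow-spin';  // sent when "Generate" is tapped
    default:  return 'off';           // unknown characters are ignored
  }
}
```

On the Arduino side, a matching switch statement selects which lighting routine to run.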

Critical analysis/Reflection

To begin with, we were going to use a separate mobile app to send user data across Bluetooth to the Arduino. Switching to the same interface used for data input adds a level of interactivity we would not have had with a separate phone app: it allows a user to instantly see the effect their inputs have had, even before the visualization updates.

This also ties the piece together better, making it an all-in-one system rather than being split up.

Future Improvements

If we had more time, I would modify the code to react differently depending on some of the user inputted data, such as changing colours or effects based on values.

 

 

Netscapes: Building Bluetooth connections

To bring together the visualisation and physical prototype, I started working on a Bluetooth connection to the MongoDB connection code I previously built.

IMG_20180113_141644

Physical prototype with HC-05 Bluetooth module

Since we already have the HC-05 Bluetooth module in place and working with the Bluetooth terminal input on mobile, I simply had to look up how to create an output system in our .js code to match the inputs we previously designed for the Arduino.

BSP design.jpg

Initial flow diagram of program

I looked into how this could be done and began researching the Bluetooth-Serial-Port module for Node.js.

After getting to grips with how the library works, I experimented with creating a basic framework for opening a Bluetooth connection and sending a basic input. This code will check for a device with the correct name, find the matching address, open a connection and, if it is successful, send the character ‘a’. When hooked up to the glass head model, this should activate the LED ring, making it light up.

bluetooth serial build code

My experimentation with BSP within the previously made MongoDB connection code
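The flow in the screenshot above can be sketched as below. To keep the sketch runnable without hardware, the BluetoothSerialPort instance is passed in as a parameter; the method and event names (inquire, found, findSerialPortChannel, connect, write) come from the bluetooth-serial-port documentation, and ‘HC-05’ is our module’s name:

```javascript
// Open a connection to the first discovered device whose name matches,
// then send 'a' to switch the LED ring on. `bt` is expected to behave
// like a bluetooth-serial-port BluetoothSerialPort instance.
function connectAndActivate(bt, targetName, onError) {
  bt.on('found', function (address, name) {
    if (name !== targetName) return; // Linux workaround: match by name, not paired list
    bt.findSerialPortChannel(address, function (channel) {
      bt.connect(address, channel, function () {
        // Connection succeeded -- send the activation character
        bt.write(Buffer.from('a', 'utf-8'), function (err) {
          if (err && onError) onError(err);
        });
      }, onError);
    }, onError);
  });
  bt.inquire(); // begin scanning for nearby devices
}
```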

 


Issues

  • Certain information missing from Bluetooth-Serial-Port NPM documentation – I had to work around this by searching for other uses of BSP to fill in the gaps
  • The method to call previously paired Bluetooth devices doesn’t work on Linux systems, so a workaround had to be made (looping through available connections and matching by name)

Next Steps

  • Update Arduino-side code: Modify existing code to include more interesting light effects, such as those I previously created for my ‘Everyware’ project. These would not be direct copies, but modifications of this pre-existing code, for a unique lighting effect.
  • Thoroughly test this code to ensure a secure connection is made and maintained for the duration of the installation.

Code Referencing/Libraries Used

Below is a list of the code documentations I used as reference when building my code. Whilst code was not directly copied, it was heavily referenced from the documentation:

bluetooth-serial-port: https://www.npmjs.com/package/bluetooth-serial-port
JS express – https://expressjs.com/en/guide/routing.html
JS json body parser – https://www.npmjs.com/package/body-parser-json
JS path – https://nodejs.org/api/path.html
JS Mongo Client – https://mongodb.github.io/node-mongodb-native/api-generated/mongoclient.html

Netscapes: Making & MLabs

Today we worked further on bringing the project together, drawing together all our current work and making improvements where necessary.

MLabs/Visualization connection

I worked on building a connection to the mLab database, pulling data and using it as parameters for a circle. The code checks the database for a new entry every 15 seconds.

readdb

Reading values from Database
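A sketch of the 15-second polling logic is below. fetchLatest is a stand-in for the actual mLab query, and the assumption is that a new entry can be detected by a change in the newest document’s _id:

```javascript
// Poll for new database entries. `fetchLatest` should return the most
// recent document (injected here so the logic runs without a live
// MongoDB connection); `onNew` receives each newly seen document.
function makePoller(fetchLatest, onNew) {
  let lastId = null;
  function check() {
    const doc = fetchLatest();
    if (doc && doc._id !== lastId) {
      lastId = doc._id;
      onNew(doc); // new entry -> update the visualization parameters
    }
  }
  return {
    check, // run one poll immediately
    start(intervalMs = 15000) { return setInterval(check, intervalMs); }
  };
}
```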

For example, I set up mapping for sliders to RGB: each slider takes a value of 0 to 8 from the user, which is mapped to a number between 0 and 255 for three of the values (in this case the vars kind, trust and help). I also applied this to the radius and speed of movement.
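The mapping works like p5.js’s map() function. A minimal sketch, assuming the 0–8 slider range described above:

```javascript
// Linearly map a value from one range to another,
// e.g. a 0-8 slider value onto a 0-255 RGB channel.
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

const kind = 4; // example slider value
const red = Math.round(mapRange(kind, 0, 8, 0, 255)); // -> 128
```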

Next, Gintaré and Chris will build this into the current state of their visualisation.

User Interface Modifications

We then looked at Gintaré’s slider inputs and how they would look in the physical build.

IMG_20180116_114315

First slider test in plinth (without the glass head or diffuser)

After reviewing both its looks and ease of interaction, we decided to make a few changes, such as making the text/scrollbar larger and removing the numbers from the sliders (as they do not display properly on the Raspberry Pi).

Gintaré made modifications based on these observations and we quickly reviewed it. We also decided to colour code each section of sliders to each section of the CANOE model. This not only breaks it up but makes it more visually appealing in a way that makes sense.

IMG_20180116_120335

Touchscreen with enlarged scroll bar for ease of use.

We decided it would still be best to display the touchscreen with the stylus for ease of use as the sliders can still be difficult to use at this size.

IMG_20180116_123645

Touch screen with colour coded sections (per canoe model)

Since the touchscreen has no enabled right-click function, once the app is full-screen it is very difficult to get out of – meaning the viewers won’t be able to (intentionally or accidentally!) exit it.

We decided to bevel the edges that surround the screen as they make it difficult for users to easily reach the screen. This will also make it look more inviting to a user by bringing it into their view.

Connecting MongoDB/mLab to front-end

I started working on code to input values into the database using Gintaré’s previously made slider interface. This was built using Express, npm and Node.js; on Chris B’s recommendation, Express was used in place of PHP.

When run, the code hosts the necessary files (such as Gintaré’s sliders) on a local server, which sends the data to the remote server when “Generate” is pressed.
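The “Generate” step can be sketched as a route handler in the style Express expects. The database insert is passed in as a function so the sketch runs without a live connection, and the field names are illustrative rather than our exact schema:

```javascript
// Handle a "Generate" press: take the slider values from the request
// body and insert them into the remote database. `insertDoc` stands in
// for the MongoDB collection insert.
function makeGenerateHandler(insertDoc) {
  return function (req, res) {
    const entry = {
      name: req.body.name,       // user's name from the form
      options: req.body.options, // slider values, one per option
      createdAt: new Date()
    };
    insertDoc(entry);
    res.send({ ok: true });
  };
}
```

In the real app this handler would be registered with something like `app.post('/generate', makeGenerateHandler(insert))`.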

 

Since Node.js code is modular, we decided to put the login details in a separate .js file that is kept off GitHub (rather than censoring the MongoDB login details in the shared code).
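The pattern looks something like the following; the credential values and connection-string format are placeholders, not our real details:

```javascript
// credentials.js would export the login details and be excluded from
// version control (e.g. via .gitignore). Shown here as a plain object
// with placeholder values.
const credentials = {
  user: 'dbuser',
  pass: 'dbpass',
  host: 'ds012345.mlab.com',
  port: 12345,
  db: 'netscapes'
};

// Build the mLab connection string from the separate credentials file.
function buildMongoUrl(c) {
  return 'mongodb://' + c.user + ':' + c.pass + '@' + c.host + ':' + c.port + '/' + c.db;
}
```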

IMG_20180116_171936

Installing Node.js & npm to Raspberry Pi

Once this was up and running (and confirmed to work on mLab), I moved the files across and installed the necessary npm packages on my Raspberry Pi. I then tested the connection to mLab to ensure the data was coming through correctly.

IMG_20180116_185022

Running the local server (Hosting the sliders form) on Raspberry Pi

We then put this server connection together with Gintaré’s updated user interface.

data canoe test

Data inserted into mLab via Raspberry Pi

mlabs multi canoe

Multiple documents in MongoDB database.

Now that we have data both coming into and out of the database, we are ready to move onto the next steps!

Next Steps

  • Finish Visualization
  • Put together final physical prototype (seat the Raspberry Pi, sort out power supplies, etc.)
  • Preview in IVT – test visualisations before presentation
  • (If time allows) Make a system for colour of head based on last data entry.

Netscapes: Building – MongoDB & Python

This week I have focused on the building stage. After helping my team members get started with p5.js, I got to work building my parts of the project: the back-end and LED control.


Emotion/colour Sliders – Python & Arduino LED control

Part of our project includes the representation of emotions in a visual sense. We decided on creating this using a basic slider input, so I got to work developing it.

I built this using:

  • Raspberry Pi 3
  • Arduino
  • 5″ GPIO touchscreen
  • Python

I created my app using libraries including PySerial (for serial connections in Python) and tkinter (for rapid production of basic user interfaces). I decided to use Python as I have previous experience creating connections to Arduino using PySerial.

Building circuits

First, I set up the Raspberry Pi 3 with the necessary drivers and fitted the touchscreen. I created a script on the desktop to open an on-screen keyboard (so I wouldn’t have to use a physical keyboard for setup later). I then built a basic circuit with an RGB LED and hooked it up to the Raspberry Pi.

IMG_20171208_003440 (1)

My Raspberry Pi with GPIO touchscreen.

Programming

I started off by building a basic slider interface using tkinter and Python. I made sure it was appropriately scaled to the screen and then worked out how to get the output of the slider in real time. In this case, I used 1 to 8 to match our data input app.

IMG_20171210_195700

Basic RGB LED wiring

Once the slider was working correctly, I set up a basic serial connection to the Arduino using PySerial. Since PySerial needs data to be sent in bytes, I had to make sure the characters sent were encoded. I then built a basic script on the Arduino side to receive the characters and change the colour of the RGB LED based on how far the slider was moved (in this case blue to yellow for sad to happy).
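The blue-to-yellow mapping can be sketched as follows (in JavaScript for consistency with the other sketches; the real implementation lives in the Python/Arduino code linked below, and the linear blend is an assumption):

```javascript
// Map a 1-8 happiness value to an RGB colour between blue (sad) and
// yellow (happy). The linear blend is illustrative -- the Arduino code
// steps between colours based on the received character.
function moodColour(value) {
  const t = (value - 1) / 7;        // normalise 1..8 -> 0..1
  return {
    r: Math.round(255 * t),         // blue has no red; yellow is full red
    g: Math.round(255 * t),         // likewise for green
    b: Math.round(255 * (1 - t))    // blue fades out as happiness rises
  };
}
```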

Link to my code on GitHub: https://github.com/Mustang601/Arduino-Serial-LED-slider 

Sequence 01_1

My completed LED slider

My next steps are to further develop the user interface, and to figure out how to use this in conjunction with the other user inputs (for database connection).


MongoDB

I created a database in MongoDB and hosted it on mLab (due to a software conflict I couldn’t easily host it on my own server, so this was the next best thing!)

The database will hold all the input data from our app, and will be used in the creation of our visualization.

mongo input document

Document within MongoDB database

The next step is to connect this database up to the input app and visualization once they are both completed.


Related Links

tkinter: https://docs.python.org/3.0/library/tkinter.html 

PySerial: https://pythonhosted.org/pyserial/

mLab: https://mlab.com/

MongoDB: https://www.mongodb.com/

Netscapes: Creation and Coding

Now that we are moving into the development stage of our project, I have started setting up the server-side technologies we are planning to use. Here is an overview of the resources we plan on using, plus an update on my work so far.


Resources we are using

Here are the hardware and software resources we are using for our project:

  • MongoDB – in conjunction with PDO/PHP – for database operations
  • P5.js (with HTML/CSS/JS etc.) – for visualizations
  • GitHub – for easy collaboration and version control
  • My server/website – for hosting etc.
  • Raspberry Pi & touch input – for user input
  • Projector & projection mapping software – such as SurfaceMapperGUI

Setting up the Server/Database

I installed MongoDB on my server, ready for us to use, and designed how it would work in relation to the data we will store and how we will store it. The database and other parts of the project will be hosted here.

Database Design

earlyflow

Sketch of Database connection flow in relation to Raspberry Pi and IVT visualization

netscapes_diagram
A flowchart I created to outline how our project will operate in relation to the server/Database.

Functional & Non-functional requirements:

Functional:

  • Input of user data from input page/app
  • Storage and retrieval of data for use in visualization

Non-Functional:

  • Security of database – PDO (PHP data objects) is used as it uses prepared statements, which makes it more secure against injection attacks. (Stackoverflow.com, 2017)
  • Security of server – a separate username and password are used for database access only, hidden from viewing by site visitors.

Sample Content of Database

Below is an outline of how our database will work, where “options” relates to the sliders within the app.

db structure

MongoDB Document structure, created by Chris (https://enoodl.com)

Primary Key | Name     | Option 1 | Option 2 | Option 3 | Option 4
----------- | -------- | -------- | -------- | -------- | --------
1           | John Doe | 1        | 6        | 5        | 3
2           | Jane Doe | 8        | 9        | 6        | 7
3           | Jill Doe | 4        | 2        | 8        | 4
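As a MongoDB document, one row of the table above might look like this (field names assumed from Chris’s structure diagram):

```javascript
// One sample document, mirroring the first row of the table above.
// The primary key becomes MongoDB's auto-generated _id.
const sampleEntry = {
  name: 'John Doe',
  options: [1, 6, 5, 3] // slider values, one per option
};
```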

 

After designing, I created a database with the necessary collections, ready to store the data once the app becomes live.

Since MongoDB works with PDO, I chose to use this for our project as I have experience working with it in the past. This is beneficial for added security against injection attacks and flexibility in case of moving to another database. (Stackoverflow.com, 2017)


GitHub

We are using GitHub for easy collaboration and version control. Since a lot of my work is on the server side, not much of it will be visible on GitHub beyond adding my contributions (such as server connections) to my other group members’ code.

My GitHub can be found here: https://github.com/Mustang601/


Links

My GitHub: https://github.com/Mustang601 
My Website: http://mustangphoto.co.uk/
MongoDB: https://www.mongodb.com/
P5.js: https://p5js.org/
PDO: http://php.net/manual/en/book.pdo.php
SurfaceMapperGUI: http://jason-webb.info/2013/11/surfacemappergui-a-simple-processing-interface-for-projection-mapping/


Sources

Stackoverflow.com. (2017). In PHP, how does PDO protect from SQL injections? How do prepared statements work?. [online] Available at: https://stackoverflow.com/questions/4042843/in-php-how-does-pdo-protect-from-sql-injections-how-do-prepared-statements-wor [Accessed 3 Dec. 2017].

Netscapes – Project Planning – Technologies

After developing our idea further, here are the technologies we are planning to use to create our project:

MySQL/MongoDB
For storing the data collected from our app. SQL may be more appropriate for its ordered approach, but it does not work easily with Phonegap.
MongoDB has plugins for Phonegap and may be considered a newer technology, making it more appropriate for our brief. (MongoDB, 2017)

Phonegap

We will use Phonegap to build a basic app containing input fields that users can download. Phonegap has a handy online compiler, meaning that apps can be built quickly that will work across multiple mobile OS’s. (PhoneGap, 2017)

Raspberry Pi

A Raspberry Pi will be used for inputting data without the use of a mobile phone. This will be achieved by pairing it with a touchscreen HAT. We will use this for the presentation within the IVT.

P5.js

p5.js is a JavaScript library with drawing functionality. We will use p5.js to create interactive web-based visualizations with server connections. (P5js.org, 2017)

Immersive Vision Theatre (IVT)

We will use the Immersive Vision Theatre for the large-scale presentation of our project. Users will be able to come in and view their data visualized in real time.


Sources:

MongoDB. (2017). MongoDB for GIANT Ideas. [online] Available at: https://www.mongodb.com/ [Accessed 25 Nov. 2017].

PhoneGap, A. (2017). PhoneGap. [online] Phonegap.com. Available at: https://phonegap.com/ [Accessed 25 Nov. 2017].

P5js.org. (2017). p5.js | home. [online] Available at: https://p5js.org/ [Accessed 25 Nov. 2017].

Netscapes – Planning & Production

Responsibilities breakdown

Here is the breakdown for our group responsibilities. This is subject to change as the project moves along.

My responsibilities are focused on programming and building the back end.

Chris

  • Project Management
  • Tutor Mediations
  • UX Development
  • Research: Personalities
  • Research: Animations
  • 3D model for RasPi case
  • Immersive Dome Projections

Steph

  • Databases
  • Server
  • JavaScript
  • Sockets
  • Animations Code
  • P5.JS
  • Node.JS
  • Immersive Dome Projection

Gintare

  • Phonegap Front end
  • HTML/CSS for website
  • Visual Research
  • Javascript
  • P5.JS
  • Animations
  • Research: Animations
  • Immersive Dome Projection

Equipment orders and collation, software/hardware (Chris, Steph)
29th Nov 17

  • Source all production materials
  • Raspberry Pi 3s
  • Touch screens
  • Finalise final form for interface container
  • OS systems
  • Server Hosting

Back End (Chris, Stephanie)
4th Dec 17

  • Find server to host MongoDB
  • Develop a MongoDB database to store parameter of user interface inputs
  • Connect simple html/ JS to database to check CRUD works
Client Side and interfaces (Gintare, Stephanie, Chris)

4th Dec 17

  • Simple PHP developed to ensure client is posting data externally/ correctly
  • Prototype phone gap front end for users to input personality data
  • Simple website to host visualisations.
  • Start to look at button interface to select background for the animations

Animations (Gintare, Chris)
11th December

  • P5.JS animations as prototype forms, with some or all parameters alterable through user interfacing.

Modelling for Interface controller (Chris)
18th December

  • A prototype of the interface container to be roughed out for both the primary and secondary input (buttons for scenery changes)

To be completed as a working structure by 9th Jan 2018


Time planning (group / individual)

Group Time Planning

Here is our group timetable, created by Chris.

netscapes group timeline

Group time schedule by Chris (enoodl.com)

 

Individual Time planning:

Below is my individual timeline for this project. Most parts of this timeline are subject to change as the project goes along, for example if we run into difficulties, or depending on others’ completion of tasks.

Blog updates marked with * may not be required.

Week Tasks
(27/11/17)
  • Blog Update
  • Decide on visualizations
  • Set up webpage for visualizations.
  • Set up server for database (Installation)
  • Begin to source equipment
  • Begin visualization development.
(04/12/17)
  • Blog Update
  • Further visualization development
  • Create database
  • Create website inputs
  • Begin to build the connection between visualization/app/database
(11/12/17)
  • Blog Update
  • Finish linking Database to app & visualization
  • Finish visualization
(18/12/17 to 01/01/18) – Christmas Break
  • Blog Update*
  • Finalization of project
(08/01/18)
  • Blog Update
  • Presentation of project (IVT)
(15/01/18) – Final 2 days
  • Blog Update*
  • Finalization of documentation
  • Refinement & finalization of submission
DEADLINE: 16/01/18
  • Submit by 23:59

Resources list & Budget

Physical Resources:

  • Raspberry Pi (Supplied by me)
  • Touchscreen for RasPi (Supplied by me)
  • 3D printer & related equipment (Supplied by university)

Budget

Our budget will have to cover building supplies as well as Arduino components. Our maximum budget is around £100 but we would like to stay below this if at all possible.

Netscapes: Arduino & Raspberry Pi – Lee Nutbean Workshop

Today we had a session with Lee Nutbean, who gave us insight into his work with Raspberry Pi and Arduino for multiple products and projects. He talked about the different types of Arduino boards available, what they are useful for, and how he has used them in the past for his own professional projects.

Race
by Lee Nutbean

lee-nutbean

Race by Lee Nutbean (Art in Odd Places, 2017)

Race is an installation piece consisting of an array of LEDs mounted on a board, designed to be hung in windows and moved around. The piece is connected to social media, and the lights will only turn off when mentions of the word “race” cease. The plus symbol flashes whilst it is connected to the internet, checking for the word “race” in a human categorization context. (Estes, 2017)

Prototyping

We looked at the different ways a single product could be built. We did this by taking a piece he had previously made and planning out how we would have made it ourselves: which technologies we would have used, how they would work together, and why we chose them.

There are multiple ways these products could be created; some easier and some more difficult, but all valid options.

Making your own Arduino

Whilst the Arduino is great for rapid prototyping, experimentation and building, it isn’t so great for a finished product.

We looked at how you could create your own Arduino, either by building and programming one yourself or by getting a custom board printed.

Whilst building with a stock Arduino can offer many benefits, building your own can be cheaper, smaller, and more specialised to the task it was built for.


Sources

Art in Odd Places (2017). Race by Lee Nutbean. [image] Available at: http://race.artinoddplaces.org/artists/nutbean-lee/ [Accessed 18 Nov. 2017].

Estes, C. (2017). Art in Odd Places | 2016 RACE. [online] Race.artinoddplaces.org. Available at: http://race.artinoddplaces.org/artists/nutbean-lee/ [Accessed 18 Nov. 2017].