AI & Deep Learning
Virtualitics is a cross-platform application that merges AI, big data and AR/VR. It uses deep learning to transform big data into easily understandable reports and visualisations within a shared virtual office, helping companies to grow. (WIRE, 2017)
Analysing big data is rarely easy: even with visualisation technology, it can be difficult to pick out useful information from large datasets. The Virtualitics platform uses AI to manage this, with algorithms that determine which metrics matter based on what you most want to learn from the data. (Siegel, 2017)
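Virtualitics has not published its metric-selection algorithms, but the general idea, scoring each column of a dataset against the variable the user cares about, can be sketched roughly as below. All data, names and the choice of plain Pearson correlation are invented here purely for illustration:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_metrics(table, target):
    """Rank every other column by |correlation| with the column the
    user cares about -- a crude stand-in for 'which metrics matter'."""
    scores = {name: abs(pearson(col, table[target]))
              for name, col in table.items() if name != target}
    return sorted(scores, key=scores.get, reverse=True)

# Invented example data: ad spend tracks revenue; office temperature is noise.
data = {
    "revenue":     [10, 12, 15, 21, 30],
    "ad_spend":    [1, 2, 3, 4, 6],
    "office_temp": [21, 19, 22, 20, 21],
}
print(rank_metrics(data, "revenue"))  # ad_spend ranks first
```

A real system would of course use far richer criteria (nonlinear relationships, categorical fields, the user's stated goal), but the shape of the problem is the same: reduce many candidate dimensions to the few worth plotting.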
The Virtualitics platform acts as a base for presenting and analysing big data, and allows up to 10 dimensions of data to be shared, giving companies a competitive edge. (Takahashi, 2017)
The platform could be applied in many sectors, such as universities and hospitals, and has already been used successfully in finance and scientific research. (Team, 2017)
- Highly interactive environment
- Can be used in multiple business applications and settings
- Makes big data accessible to everyone – even untrained users can easily explore the data.
- Simple and easy to use, automatically turns data into useful graphs based on what you want to learn from it.
- 3D VR office space may not be appropriate for all applications.
- VR headsets can be expensive – If the platform requires multiple headsets (such as the shared office space) this could end up being quite costly for a company.
Ultrahaptics is a startup whose technology allows users to feel virtual objects physically. Using ultrasonic projections and hand tracking, users can feel and interact with virtual environments, receiving real tactile feedback without the need to wear or hold any special equipment. (Ultrahaptics, 2017)
The system is built from an array of ultrasound emitters working in conjunction with motion sensors. Haptic feedback is created by first defining a space in which to model the acoustic field. Within this field, focus points are created with differing types and intensities of feedback. (Kevan, 2015) This allows users to use both hands simultaneously, or to interact with multiple objects. (Kahn, 2016)
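Ultrahaptics' own firmware is proprietary, but the core trick behind a focus point, driving each emitter with a phase offset so that every wave arrives in step at the chosen point and the acoustic pressure adds up there, can be sketched as below. The array layout, 40 kHz frequency and function names are assumptions for illustration only:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air (approximate, room temperature)
FREQ = 40_000.0         # 40 kHz, a typical ultrasonic haptics frequency

def focus_phases(emitters, focal_point):
    """Phase offset (radians) for each emitter so that all waves
    arrive in phase at the focal point, creating a pressure focus."""
    wavelength = SPEED_OF_SOUND / FREQ
    # Advance each emitter's phase by its travel phase, modulo 2*pi,
    # so the travel delay is cancelled at the focus.
    return [(-2 * math.pi * math.dist(e, focal_point) / wavelength)
            % (2 * math.pi) for e in emitters]

# A 4x4 grid of emitters spaced 1 cm apart in the z=0 plane,
# focusing 20 cm above the centre of the array.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(grid, (0.015, 0.015, 0.20))
```

Modulating such a focus point (in strength or position) is what turns a static pressure spot into something the skin perceives as texture or vibration; multiple simultaneous focus points give the multi-object interaction described above.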
- Highly Interactive – encourages user engagement
- Can be used in multiple applications
- Could make other AR and VR apps more immersive when used together
- All in one development kit, tools and support.
- Possibility to create multiple “objects” within 3D space.
- In certain applications, physical buttons could be more appropriate
- Users can still “push through” objects – they can be felt, but are not solid.
- The platform can (and does!) create noise and vibrations; whilst work is being done to minimise this, it will most likely always be present.
Whilst this sort of technology is still in its infancy, it offers a promising insight into the future of interactive technologies. In future, it could be applied to uses such as 3D sculpt modelling, or to making VR and AR experiences far more immersive.
Digilens combines AR and holographic technologies. They build AR screens for use in multiple applications, including inside car windshields and in aeroplanes. These screens can display real-time data, enhancing driver awareness and safety. (DigiLens, Inc., 2017)
- Fully customisable displays.
- Wide range of uses, both commercial and private.
- Can enhance driver awareness & Road safety
- Less bulky than traditional displays
- Could distract drivers by drawing their attention away from the road
- Cost of building and adding to cars
The Workers, 2014
After Dark is an exhibition piece built using Raspberry Pi. It allows viewers to take control of and drive a robot, exploring the exhibitions of Tate Britain via live video feed after closing time. (Afterdark.io, 2014)
It was created as a way to engage new audiences in art, allowing them to explore the exhibitions without even having to set foot inside the building. Whilst viewers drove the robots, art experts provided live commentary, offering new insights into the pieces on display. (The Workers, 2014)
The robots were fitted with cameras and lighting, as well as sensors to ensure they could navigate the galleries without complication. (Tate, 2014)
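The collision safeguards described above can be sketched as a simple command filter: a viewer's drive command only reaches the motors while the proximity sensors report a clear path. The Workers have not published the robots' control code, so every name and threshold below is a hypothetical illustration:

```python
SAFE_DISTANCE_CM = 30  # assumed stopping threshold, purely illustrative

def filter_command(command, sensor_distances_cm):
    """Block forward motion when any proximity sensor reports an
    obstacle closer than the safe threshold; other moves pass through."""
    if command == "forward" and min(sensor_distances_cm) < SAFE_DISTANCE_CM:
        return "stop"
    return command

print(filter_command("forward", [120, 25, 90]))  # obstacle at 25 cm -> stop
print(filter_command("left", [120, 25, 90]))     # turning is still allowed
```

On the real robots this check would sit between the remote viewer's input stream and the motor drivers, so a laggy or careless driver could never push the robot into an exhibit.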
- Highly Interactive – encourages user engagement.
- Acts as a platform for learning and exploration.
- Live art expert commentary makes the experience more than just “driving a robot”.
- Could be costly to build & run
- Battery powered robots – battery life is always a concern, particularly when these robots are connected to the internet and streaming for multiple hours.
- Special measures must be taken to prevent damage to museum exhibits.
Whilst this is an interesting idea, it is important to note that virtual museum tours already exist (such as video tours or even VR tours, which also sometimes provide commentary), and the act of driving the robot could be considered nothing more than a gimmick.
Zach Gage, 2016
Glaciers is an installation piece built using 40 Raspberry Pi systems, exploring the interactions between digital platforms (in this case, search engines) and humans. The Pis are programmed to take the top three autocomplete suggestions that follow various phrases and display them on screens, creating odd poetry that reflects the nature of the modern age. (Bate, 2016)
Although the screens appear static, the phrases are updated once a day based on the most popular autocompletes. Because of this, the poems could change daily, but rarely do. (Gage, 2016)
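Gage's code is not public, so the sketch below stubs out the autocomplete query with canned data; the overall loop, seed phrase in, top three suggestions out, stacked into lines, follows the description above:

```python
def fetch_suggestions(phrase):
    """Hypothetical stub for an autocomplete query. A real installation
    would hit a search engine's suggestion endpoint once per day; here
    the responses are canned so the sketch runs offline."""
    canned = {
        "why is": [
            "why is the sky blue",
            "why is my cat staring at me",
            "why is the ocean salty",
        ],
    }
    return canned.get(phrase, [])

def make_poem(phrase):
    """Stack the top three autocomplete suggestions for a seed phrase
    into a three-line 'poem', one suggestion per line."""
    return "\n".join(fetch_suggestions(phrase)[:3])

print(make_poem("why is"))
```

Each of the 40 Pis would run something like this for its own seed phrase, re-fetching daily and redrawing its screen only when the suggestions actually change.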
- Relatively cheap and simple to build – the underlying technology is inexpensive and easy to come by.
- Simplistic nature
- Concept understandable to most viewers
- Not interactive
- Due to the nature of Google autocomplete, poems do not change often (sometimes not at all)
Further images of Glaciers can be seen on the Postmasters gallery website (Gage, 2016).
Virtualitics (2017). Virtualitics office space. [image] Available at: https://www.virtualitics.com/ [Accessed 28 Oct. 2017].
WIRE, B. (2017). Virtualitics Launches as First Platform to Merge Artificial Intelligence, Big Data and Virtual/Augmented Reality. [online] Businesswire.com. Available at: http://www.businesswire.com/news/home/20170126006150/en/Virtualitics-Launches-Platform-Merge-Artificial-Intelligence-Big [Accessed 28 Oct. 2017].
Virtualitics (2017). Virtualitics [online] Available at: https://www.virtualitics.com/ [Accessed 28 Oct. 2017].
Siegel, J. (2017). How this Pasadena startup is using VR and machine learning to help companies analyze data. [online] Built In Los Angeles. Available at: http://www.builtinla.com/2017/05/05/virtualitics-vr-data-science [Accessed 28 Oct. 2017].
Takahashi, D. (2017). VR analytics startup Virtualitics raises $4.4 million. [online] VentureBeat. Available at: https://venturebeat.com/2017/04/10/vr-analytics-startup-virtualitics-raises-4-4-million/ [Accessed 28 Oct. 2017].
Team, E. (2017). Virtualitics: Caltech & NASA Scientists Build VR/AR Analytics Platform using AI & Machine Learning – insideBIGDATA. [online] insideBIGDATA. Available at: https://insidebigdata.com/2017/08/05/virtualitics-caltech-nasa-scientists-build-vrar-analytics-platform-using-ai-machine-learning/ [Accessed 28 Oct. 2017].
South West Business (2016). Ultrahaptics development kit. [image] Available at: http://www.southwestbusiness.co.uk/regions/bristol/meteoric-rise-of-ultrahaptics-continues-as-bristol-firms-incredible-touchless-tech-is-about-to-go-mainstream-07122016090019/ [Accessed 28 Oct. 2017].
Ultrahaptics. (2017). Ultrahaptics – A remarkable connection with technology. [online] Available at: https://www.ultrahaptics.com/ [Accessed 28 Oct. 2017].
Ultrahaptics (2015). Ultrahaptics diagram. [image] Available at: http://electronics360.globalspec.com/article/5907/touch-control-with-feeling [Accessed 28 Oct. 2017].
Kevan, T. (2015). Touch Control with Feeling | Electronics360. [online] Electronics360.globalspec.com. Available at: http://electronics360.globalspec.com/article/5907/touch-control-with-feeling [Accessed 28 Oct. 2017].
Kahn, J. (2016). Meet the Man Who Made Virtual Reality ‘Feel’ More Real. [online] Bloomberg.com. Available at: https://www.bloomberg.com/news/features/2016-02-03/uk-startup-ultrahaptics-is-making-virtual-reality-feel-more-real [Accessed 28 Oct. 2017].
Digilens (2017). Digilens car HUD. [image] Available at: http://www.digilens.com/products/autohud/ [Accessed 29 Oct. 2017].
DigiLens, Inc. (2017). Home – DigiLens, Inc. [online] Available at: http://www.digilens.com/ [Accessed 29 Oct. 2017].
The Workers (2014). After Dark Robot. [image] Available at: https://theworkers.net/after-dark/ [Accessed 28 Oct. 2017].
Afterdark.io. (2014). After Dark. [online] Available at: http://www.afterdark.io/ [Accessed 28 Oct. 2017].
The Workers. (2014). The Workers: After Dark. [online] Available at: https://theworkers.net/after-dark/ [Accessed 28 Oct. 2017].
Tate. (2014). IK Prize 2014: After Dark – Special Event at Tate Britain | Tate. [online] Available at: http://www.tate.org.uk/whats-on/tate-britain/special-event/ik-prize-2014-after-dark [Accessed 28 Oct. 2017].
Gage, Z. (2016). Installation View. [image] Available at: http://www.postmastersart.com/archive/gage16/install1.html [Accessed 28 Oct. 2017].
Bate, A. (2016). Using Raspberry Pi to Create Poetry. [online] Raspberry Pi. Available at: https://www.raspberrypi.org/blog/autocomplete-poetry/ [Accessed 28 Oct. 2017].
Gage, Z. (2016). ZACH GAGE – Glaciers @ Postmasters: March 25 – May 7, 2016. [online] Postmastersart.com. Available at: http://www.postmastersart.com/archive/gage16/gage16direct.html [Accessed 28 Oct. 2017].
Gage, Z. (2017). He Says. [image] Available at: http://www.postmastersart.com/archive/gage16/hesays.html [Accessed 28 Oct. 2017].