Reality Computing 16-456
Professor: Tom Corbett | Fall 2017
This course, taught by John Folan and Tom Corbett, is a collaboration between Carnegie Mellon's Integrative Design, Arts, and Technology interdisciplinary program and the School of Architecture's Urban Design Build Studio (UDBS) on Project Re_Con. UDBS is a year-long architecture studio composed of Bachelor of Architecture, Master of Architecture, and Master of Architecture, Engineering and Construction Management students who design, prototype, and construct low-income housing projects within Pittsburgh.
The objective of Reality Computing was to aid UDBS's efforts by developing augmented reality and virtual reality communication/interaction tools for later implementation within a mobile home incubator space.
The Reality Computing team co-produced the interaction and experience design of the VR/AR work through prototyping and development. All of us worked toward immersive experiences that involve residents in the design process, in hopes of building trust within the community toward these new developments. The purpose of using VR/AR was to help the general public better visualize the completed housing project integrated into the fabric of Pittsburgh.
Sources: Pittsburgh Archive, Healthy Ride, US Census, etc.
Tools: Esri ArcGIS, D3.js
Reality Computing Team: Marisa Lu, Vikas Yadav, Cody Sokas, Lexi Yan
UDBS Team: Ankur Dobriyal, Yoonho Oh, Cassidy Rush, Jacob Clare, Jay Tyan
We were tasked with gathering relevant information and ArcGIS data sets to assess three potential sites. Our group was assigned 45 Carver Street in the Larimer neighborhood of Pittsburgh. Our objective was to understand the lives of past and current residents and to forecast future trends, and to prepare relevant findings for presentation to developers, investors, residents, and backers.
Hardware: Ricoh 360 camera, Faro LiDAR scanner
Software: Unity, Rhino, SketchUp, Revit, Autodesk ReCap, etc.
Next on the agenda was to build a VR tool that lets the architecture students and developers see their designs in context, to better understand spatial, visual, and environmental factors (e.g., sunlight through the neighborhood, proportional relationships, surrounding context). Along the way, we experimented with various technologies to refine a pipeline for generating VR environments.
One of the East Liberty Development Managers walking through the neighborhood with the proposed architectural changes
VR, HTC VIVE
HOME INCUBATOR EXPERIENCE
Skills: Critical Thinking, Rapid Prototyping
Software: Unity, AR kit + core, Vuforia, A-Frame
The purpose of the home incubator is to travel through various neighborhoods of Pittsburgh week by week to expose as many residents as possible to our proposal. The purpose of the VR interaction is to give visitors a sense of what the site and building proposal would look like, even if they aren't physically standing in the neighborhood. Our job is to curate the multiple digital interactions in the Home Incubator to create the best possible user experience while still conveying the necessary information.
Initial mobile incubator grant proposal
Architecture students' efforts towards affordable housing
Discussion with developers and community leaders. Getting to know residents through interviews with direct feedback from neighborhood leaders.
After reaching a certain point of development with the VR models for the individual sites, a final site was chosen by Professor Folan. After collective planning, the Reality Computing cohort decided to split into four groups consisting of Interaction, VR, AR, and projection mapping. My role, as someone both in the AR and Interactions group, was to bridge the two groups by maintaining the visual consistency and specializing in the interactions for the AR component.
Overview of the entire process
ONBOARDING WITH A CHATBOT
If you would like to read Vikas Yadav's Medium post explaining what he and Marisa Lu did with the onboarding, click here.
Why a chatbot? The idea synthesizes multiple goals: onboarding, data collection, and managing incubator activities. Collecting data on an individual buyer helps craft a personalized experience both within the chatbot (suggestions and financial consultation services) and during the visit to the incubator. More importantly, a chatbot lets us be present for a buyer 24/7, just in time. The team paid close attention to crafting conversations that feel human rather than adopting a traditional bot's tonality; the chatbot is meant to act as an extension of the Re_Con team, almost like a team member chatting with the buyer. On the technical front, a chatbot can afford a plethora of functionality: in Re_Con's case, it supports conversations around personal data collection as well as a built-in AR scanning capability for a tangible takeaway.
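The onboarding-plus-data-collection idea above can be sketched as a small script. This is a minimal illustration only, not the team's implementation: the question keys, prompts, and suggestion logic are all hypothetical stand-ins for whatever the real chatbot collects.

```python
# Illustrative sketch of an onboarding chatbot that collects buyer data
# and uses it for a simple personalized suggestion. All names are invented.

ONBOARDING_QUESTIONS = [
    ("name", "Hi, I'm the Re_Con assistant! What's your name?"),
    ("household_size", "How many people are in your household?"),
    ("budget", "What monthly housing budget are you comfortable with?"),
]

def run_onboarding(answer):
    """Walk a visitor through onboarding, storing responses in a profile.

    `answer` stands in for the chat channel: it takes a prompt string
    and returns the visitor's reply.
    """
    profile = {}
    for key, prompt in ONBOARDING_QUESTIONS:
        profile[key] = answer(prompt)
    return profile

def suggest(profile):
    """Toy personalization: route a follow-up based on collected data."""
    if int(profile["household_size"]) > 3:
        return "You may want to look at the three-bedroom layouts."
    return "The two-bedroom layouts could be a good fit."
```

In practice the profile would feed both the in-chat suggestions and the personalized incubator visit the section describes.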
Understanding implications — Chatbots can be effective communication tools, but there is a limit to how far this chatbot can assist potential buyers. Somewhere along the way, humans will have to represent Re_Con. Combining the chatbot with the incubator makes a compelling case for a well-balanced communication and outreach strategy for Re_Con.
When a buyer visits the incubator, we plan to take them through a neighborhood primer via projection mapping. In this stage we communicate the historical, geographical, and socio-economic makeup of the neighborhood; the goal is to convey the value of a given neighborhood before buyers check out house details.
VR Team: Anna Gusman, Monica Huang, Scott Leinweber, Soonho Kwon
EXPLORING THE NEIGHBORHOOD THROUGH VIRTUAL REALITY
After the neighborhood primer, we take buyers through the neighborhood VR experience. Our VR team has been crafting high-quality VR environments for the past couple of weeks. The plan is to use the HTC Vive headset with motion controllers to immerse buyers in these environments. We intend to augment the environments to communicate the immediate neighborhood by calling out adjacent houses and related information and, building on top of that, to highlight nearby facilities, services, and amenities. Such features impart value to a neighborhood and make it more livable.
AUGMENTED REALITY USED IN THE INCUBATOR
To read our documentation of the process, click here.
AR Team: Marisa Lu, Aisha Dev, and Lexi Yan
Why AR? Prospective house buyers don't make decisions or understand what a house might feel like just from looking at static floor plans, drawn elevations, or even high-fidelity renderings, and there is no actual house they can physically tour. How do you communicate spatial, location-tied information? AR has a good chance at tackling that.
How might one use AR to give a sense of space and scale, or of other relevant information tied to objects in space? Onsite, the literal 1:1-scale floor plan can be delineated, and a digital 3D model materialized to scale and 'placed' in real space on top of the outlined plan with augmented reality.
“One thing we’ve seen clearly is that AR is most powerful when it’s tightly coupled to the real world, and the more precisely the better,” said Clay Bavor, speaking at Google I/O.
Since our role was more to explore and do early prototyping, most of our efforts concentrated on researching AR tools, evaluating the limitations and strengths of each option, and exploring how AR would/should be implemented on site.
UX considerations needed for the AR onsite ‘tour’ include:
App start and model loading
Ensuring accurate scale
Recovery and recalibration
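The tour flow listed above — loading, scale verification, and recovery — can be modeled as a small state machine. This is an illustrative sketch only, not the actual app logic; the state names and the 2% scale tolerance are invented for the example.

```python
# Toy state machine for the onsite AR tour: app start/model loading,
# ensuring accurate 1:1 scale, and recovery/recalibration.
# States and thresholds are illustrative assumptions, not the real app.

from enum import Enum, auto

class TourState(Enum):
    LOADING = auto()        # app start and model loading
    SCALE_CHECK = auto()    # verifying scale against the outlined plan
    TOURING = auto()        # walking the 1:1 model on site
    RECALIBRATING = auto()  # recovery when scale or tracking drifts

class ARTourSession:
    SCALE_TOLERANCE = 0.02  # accept placements within 2% of true 1:1 scale

    def __init__(self):
        self.state = TourState.LOADING

    def model_loaded(self):
        self.state = TourState.SCALE_CHECK

    def report_scale(self, measured_scale):
        # Compare the placed model's scale against the intended 1:1 scale.
        if abs(measured_scale - 1.0) <= self.SCALE_TOLERANCE:
            self.state = TourState.TOURING
        else:
            self.state = TourState.RECALIBRATING

    def recalibrated(self):
        # After recovery, re-verify scale before resuming the tour.
        self.state = TourState.SCALE_CHECK
```

Framing the tour this way makes explicit that recalibration loops back to a scale check rather than dropping the user straight back into the model.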
If you continue with ARKit, there are native iOS UI assets that work well with the SDK
Sketch (the popular vector-based industry standard for UI designers) has iOS assets to start from
Technical Brief: Full Keynote Presentation
While developer accounts and licenses should not be an issue, other resources might be. With a design that uses ARKit or ARCore, the required hardware (iPhone 6s or newer, or a Google Pixel/Galaxy S8) is expensive enough that we can't assume residents and visitors would have it to load the app; tablets/display phones would have to be provided.
User Experience On Site
Physical Considerations — With 5–8 people per session, we had to consider the physical implications of efficiently getting everyone started on the iPads and moving through the space without hindering the experience of other users. This experience will have to be tailored to the final form of the incubator, since that affects the flow of traffic and the square footage available for individual virtual experiences. If the AR experience is outdoors, existing site conditions will have to be factored in, such as terrain, rocks, trees, bushes, sidewalks, and other barriers. These will not be hindrances if the site can be cleared or altered for ease of use while the incubator is on site.
Digital Considerations — We wanted to make the navigation and options as simple and easy to use as possible. When designing for AR, keeping things simple is vital, with unobtrusive hints; hence, we implemented a visual tutorial to instruct people on what they can alter. The secondary options (previous versions, energy-saving options, and cost analysis), along with a help icon, sit in the four corners of the iPad screen. This makes them easily visible to the user, simple to tap, and lowers the risk of tapping a neighboring icon.
To read our full documentation of the process, click here.
Team: Aisha Dev and Lexi Yan
We were tasked with designing the interface for the iPad AR application used in the incubator while Marisa continued to explore the capabilities of AR. Our main goals were to help users visualize what the house would look like if they bought and furnished it, evaluate the cost of their decisions so they know whether those decisions are affordable, see previous visitors' customizations to watch the design shift with every new visitor, and learn about energy-saving options that can make utilities much more affordable.
USER INTERFACE OF AR PLATFORM
AUGMENTED REALITY EXPLORATION
Tangible Takeaway with Augmented Reality
I was tasked with designing the tangible-takeaway portion of the AR experience. With Marisa's help (top right, testing on a larger printout), we prototyped the AR marker for the cover of the handout.
What was the purpose of the tangible takeaway? It is intended to bring your onsite experience home. After curating the inside of your new augmented-reality home, you'll be prompted to save your customization, and a handout will be printed with your custom AR marker on the front that can be used with a phone or webcam. Inside the booklet is information about Project Re_Con, UDBS, the incubator, and important issues like affordable housing and financial resources. Visitors can take a part of their experience home to share with friends and family.
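One way to pair each printed handout with its saved customization is to derive a short marker ID from the visitor's choices. The sketch below is a hypothetical scheme for illustration — the actual project could key markers any way it likes; the function names and the in-memory store are invented here.

```python
# Illustrative sketch: derive a stable ID linking a printed AR marker
# to a saved home customization. The scheme and names are hypothetical.

import hashlib
import json

def marker_id(customization):
    """Hash the customization into a short, deterministic ID that the
    printed marker can encode and the viewer app can look up later."""
    payload = json.dumps(customization, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()[:8]

# Stand-in for a backing store mapping marker IDs to saved scenes.
saved_scenes = {}

def save_customization(customization):
    """Save a visitor's choices and return the ID for their handout."""
    mid = marker_id(customization)
    saved_scenes[mid] = customization
    return mid

def load_from_marker(mid):
    """Recover the saved customization when a marker is scanned at home."""
    return saved_scenes.get(mid)
```

Because the ID is deterministic, re-scanning the same handout always resolves to the same saved scene.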