We are proud to announce the completion of support for Visiting Assistant Professor of Biology Jessica Fellmeth’s Survey of Human Anatomy course. Throughout the semester, students worked in groups and scheduled times to use the two Virtual Reality (VR) applications Organon and YOU by Sharecare to record themselves explaining sections of human anatomy. These virtual presentations furthered the students’ critical thinking and oral communication skills. Students reported that this immersive learning experience helped them process and retain the information they were studying. Scott Paul, Help Desk and LITS Student Manager, used the HP OMEN laptop and the HTC VIVE to provide an in-class demonstration of the VR equipment. Afterwards, the students worked with our Digital Media Interns and Tutors to use screen capture software to record their perspectives in the VR environments and produce presentations. Professor Fellmeth has expressed interest in exploring opportunities to utilize VR applications in future lessons and courses.
Below are brief examples and screen captures of student perspectives in two of the VR applications.
During the Spring 2018 semester, Visiting Assistant Professor of Literature Nhora Lucía Serrano incorporated virtual reality technologies and assignments into her interdisciplinary Literature 232 “Dream a Little Dream: Virtual Realities & Literature” course. The intermedia course, designed to introduce students to the representation of virtual worlds in literature, explored emerging technologies such as StoryMaps, 3D virtual objects, and 3D virtual realities. Taking an analytical and digital humanities methodological approach, Literature 232 sought to interrogate how literature can represent and problematize the paradigmatic relationship between the so-called real world and the dreamscape one. The course culminated with students creating virtual reality environments inspired by scenes and passages from the literary texts read throughout the semester. According to Prof. Serrano, this unique approach to teaching a humanities course with digital technology allowed students to be producers, not just consumers, of technology.
In order to teach students the basics of VR development, Serrano teamed up with Instructional Designer and 3D Technology Specialist Ben Salzman, who provided multiple workshops for the course on tools such as Unity, Tinkercad, and screen recording software. Throughout the semester, students utilized Salzman’s expertise, along with templates and assets, to create immersive environments inspired by one of the virtual worlds read about in class. Working in groups of three to four, the students developed and refined their skills in reading, critical thinking, creativity, and literary analysis via digital humanities, digital arts, and technology.
This semester, our new Universal Laser Cutter was put to creative use. Julie Suk (Class of ’18) worked alongside Educational Technologist Bret Olsen to produce images that combined the laser cutter, medium format photography, scanning, Photoshop, and Illustrator. Julie initially explored intricate cutting of the images by hand and with power tools, but couldn’t achieve the desired results without laser cutting. After extensive experimentation and research, Julie and Bret landed on adhering her prints onto Sintra (a PVC board similar to foam core board) before loading them into the cutter. This provided a clean, precise backing for cutting her prints, and the results are spectacular.
Julie’s senior thesis work will be on view through May 20 at the Ruth and Elmer Wellin Museum of Art. Visit her webpage to read her artist statement and see more images of her work. This thesis project was also made possible with the support of the Steven Daniel Smallen Memorial Fund and the Art Department Fund for Seniors.
Judy Zhou ’19 has been selected for our Hamilton College Instructional Technology Apprenticeship Program (ITAP). The program, founded by members of the New York Six Liberal Arts Consortium, provides paraprofessional experiences to students interested in instructional technology. Students are provided a multitude of experiential learning opportunities, including course support, pedagogy workshops, and networking.
Judy’s interest in the use of Virtual Reality (VR) in higher education prompted her to develop a unique project that models the “Walk of Privilege.” This activity, which works to raise awareness of social, economic, and cultural differences among students, has been proven to help participants recognize diversity within their community.
For those who are not familiar with the Walk of Privilege, the process begins with all participants lining up side-by-side in a large space (such as a football field or gymnasium). As a list of statements is read by a facilitator, the students step forward or backward depending on whether each statement reflects a privilege they hold or lack. Here are examples of the statements that could be read:
“If you can find Band-Aids at mainstream stores designed to blend in with or match your skin tone, take one step forward,” and “If you come from a single-parent household, take one step back.”
As the statements continue to be read, each student moves forward or backward, and at the end everyone can see a visual representation of how these privileges and challenges are distributed among their peers. The goal of Zhou’s project is to introduce this educational experience via VR on college campuses to promote self-reflection and empathy among students and the general community.
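The mechanics of the activity translate naturally to software: each participant occupies a position on a number line, and each statement moves them one step forward or backward. A minimal Python sketch of that scoring, with statements and responses invented purely for illustration (this is not Zhou’s actual VR implementation):

```python
# Sketch of Walk of Privilege scoring: a "yes" to each statement moves
# a participant one step forward (+1) or backward (-1).
# Statements and participant responses below are hypothetical examples.

prompts = [
    ("Band-Aids at mainstream stores match your skin tone", +1),
    ("You come from a single-parent household", -1),
]

def final_position(responses, prompts):
    """responses[i] is True if statement i applies to the participant."""
    return sum(step for (_, step), applies in zip(prompts, responses) if applies)

# Two hypothetical participants answering the two statements above:
alice = final_position([True, False], prompts)  # ends one step forward
bob = final_position([False, True], prompts)    # ends one step back
```

At the end, the spread of `final_position` values across all participants is exactly the visual lineup the physical activity produces.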
As part of our participation in the Campus of the Future project, Judy will be working with our new HP zWorkstation and the HTC VIVE to develop her VR environment.
After several planning meetings, Visiting Assistant Professor of Literature Nhora Serrano’s exciting and ambitious course has started! The course “introduces students to the representation of virtual worlds in literature, and how these ‘dreamscapes’ have transformed our understanding and experience of the ‘real.’” This positions modern-day virtual reality devices within a historical and philosophical framework, dating back as far as the shadowy projections in Plato’s Allegory of the Cave. The course goals include having students explore emerging technologies “in order to interrogate how the literary texts represent and problematize the paradigmatic relationship between the so-called real world and the dreamscape.”
Engaging with this, students will create virtual realities of their own, familiarizing themselves with the contemporary technology to do so. Instructional Designer and 3D Technology Specialist Ben Salzman will be helping the class work with a Unity template and assets to create, in groups of two or three students, an immersive environment (which viewers will be able to virtually walk through!) from one of the virtual worlds read about in class.
The course will build towards the final project all semester, with workshops starting in February. After trying out our HTC VIVE virtual reality headset and exploring a sample of the type of scene they will be able to create with Unity during this semester, the students will create a 3D virtual object from either Carroll’s Through the Looking Glass or Baum’s The Wizard of Oz. This individual project will serve as scaffolding, enabling students to get their hands into 3D design in a more accessible, simpler, and lower-stakes way than their final project. The 3D virtual object will be made through some combination of 3D scanning, the easy-to-use browser-based 3D design app TinkerCAD, and virtual reality apps like MakeVR Pro or Medium that enable design work to be done in virtual reality instead of on a computer screen. Students will track their learning and display their virtual objects on a course blog.
Visiting Assistant Professor of Literature Andrew Rippeon has some exciting projects blending the 15th-century technology of the letterpress with the cutting-edge technology of the laser cutter. The Resident Designer and Production Manager for Theatre Jeff Larson also attended the training, with an eye on how laser cutting can invigorate the Theatre Department’s creation of models, puppets, sets, props, and other design elements! Our laser cutter is currently housed in the Theatre Department’s costume shop, pending a future space for 3D and digital fabrication technologies.
Andrew Rippeon (Visiting Assistant Professor of Literature) peers into the laser cutter while it engraves a woodblock.
Scott Paul (Help Desk and LITS Student Manager), Bret Olsen (Educational Technologist), and Jeff Larson (Resident Designer and Production Manager) set up a file to demo the laser cutter.
We’ll be using Adobe Illustrator to design our files for laser cutting. We tested several materials, including paper, plastic, and wood. We experimented with the differences between soft and hard wood; plywood proved difficult to cut through, since it is a composite of several wood types, each of which would require different settings. Beyond cutting paper in individual sheets or small stacks, one of our discoveries was the possibility of marking on and etching into paper with the laser cutter.
Test of using the laser cutter to create an icon of a hand for use with the letterpress. The icon was created from a scan of a broadside.
A test of the same icon on a different type of wood with less grain.
A test of laser cutting words into paper at various power levels, ranging from faintly etching the paper to cutting all the way through.
Watch our laser cutter in action, cutting letters into paper.
The book made by the first-year students in Visiting Assistant Professor of Literature Andrew Rippeon’s Unpacking My Library course is now viewable! Under Rippeon’s mentorship, students designed and wrote the book, used the letterpress lab to create the cover, and used augmented reality (AR) to link images and video into the experience of the book. Rippeon writes:
The book as a whole compiles student writing across the semester (every essay and exercise is represented, but these have been revised and trimmed by the students for content and context). During the last several weeks of the semester, and in parallel with other individual and group projects, students worked in small groups on various aspects of the project: a Research Group compiled our timeline and wrote introductory and concluding statements; an Editorial Group collected and compiled various pieces of writing from their fellow students; a Design Group worked on layout and executed the actual physical covers by letterpress; a Documentation Group collected photographs and film recording our various activities and trips during the semester; and an “AR-Group” (Augmented-Reality) worked to develop and deploy AR overlays onto the physical book.
In an interesting blending of the physical and digital aspects of the book, the AR cues (the images or glyphs at the head of each piece of writing) were in fact printed letterpress in another project, and then scanned and digitally inserted into this book. So the AR cues are in fact digital manipulations of material elements. As a hybrid object—including the process of its construction, the writing and revising that went into the book, the AR additions (and challenges) to the book, and the experiential activities the book documents—the book represents the students thinking both collaboratively and on their own about the history of the book as we’ve explored it, and its possible futures (even and especially in an increasingly digital and digitized environment).
To view the augmented reality elements of this book, please:
Download HP Reveal (formerly Aurasma) from the iTunes or Google Play Stores.
The app will open to the Viewer. Tap the search icon at the bottom of the view screen to get to the Explore page, at the bottom of which you’ll find another search icon that will take you to the Search page.
Search for rid.hamilton. From the results, select rid.hamilton’s Public Auras, and click follow.
Return to the HP Reveal Viewer to explore the book.
For the HP Reveal viewer to register the glyphs in the book displayed below, use the buttons above the book to enter fullscreen or zoom in. If the book is not displaying below, follow this link >>
We’ve had a pleasant surprise, as we begin finals week here at Hamilton College—the arrival of the Dremel 3D40-EDU! This easy-to-use 3D printer geared towards education boasts a non-clogging nozzle, quiet operation, a learning community, and web-based design and printing software.
We’ll be figuring out where in Burke library this 3D printer will live, and setting it up over winter break! We anticipate using this for Visiting Assistant Professor of Literature Nhora Serrano’s “Dream a Little Dream” course, as well as for our student interns and Digital Media Tutors to get hands-on experience running and maintaining a 3D printer.
All 52 students in Contemporary Computing Concepts have met with our interns and experienced a half-hour of Virtual Reality with the HTC VIVE!
Students expressed universally positive sentiments about this as a learning experience; one student said, “Experiencing VR helped give me a better understanding of VR & UX technologies. It was cool to actually have an immersive experience rather than just using my phone.” It also helped students understand how advanced contemporary VR can be. One student noted after trying out the VIVE, “This completely changed my understanding of VR. Previously, I thought of VR as some sort of gimmick. Experiencing it first hand has made me realize that this technology is extremely impressive.”
The students even hypothesized about some possible uses of VR:
– “I forsee using this tech in a class setting to gain better understanding of molecular structures commonly seen in bio or chem.”
– “Yes, to learn how to do things on the job before actually doing it (for example doing a practice surgery).”
– “Kids learning about the ocean could take a ‘trip’ to the ocean and businesses could see what their product would look like before creating it.”
– “In the future it could be used to hold meetings between people far away.”
Our VIVE saw 10 hours of use for this course this week, bringing it to 31 hours total for this assignment. We had a hiccup mid-week when the system stopped functioning for an unknown reason; it turned out that it needed a firmware update, which we were able to do the next day. After seeing our system get this much use, we’re hoping to move towards a model where the sensors are mounted on the walls, instead of mounting the sensors on tripods with interns setting them up and taking them down each time.
The first week of Virtual Reality for Contemporary Computing Concepts is done! Taught by Professor of Computer Science Stuart Hirshfield, students in this introductory course are using the HTC VIVE to get a hands-on experience of Virtual Reality, which is the contemporary issue the course ends on. The students have been meeting one-on-one with our paraprofessional student interns, in half hour blocks.
In the first week, we’ve used the VR system for 21 hours and met with 38 out of the 52 students in the course. There were three no-shows. Scheduling was done through a Google Calendar for the VIVE system, handled entirely by student interns.
This week, we ran into several tech challenges with Visiting Assistant Professor of Literature Andrew Rippeon’s “Unpacking My Library” course.
Following our prior attempt to produce 3D scans of “altered book” art objects, the students decided that the noise and random digital alteration from the imperfect 3D scans worked to augment their project. We continued 3D scanning, leaning into the Sprout’s imperfections as part of the creative process. Some of the scans can be viewed on Remix3D.
Previously, we had thought Aurasma might be the perfect solution for our AR needs. However, while Aurasma does support 3D models, it requires a specific file type; the best practices involve designing scenes in Maya, and so it does not work quickly with 3D scans. Given the scope of this course, we have opted to not use Aurasma’s 3D model support. Instead, we plan to store our 3D models in a repository online, and make use of Aurasma’s capacity to link to websites, triggering a visit to the 3D model repository when the Aura is tapped.
Uploading the 3D scans proved another challenge. The students had saved them as .3mf files, the default for the Sprout Pro. These files do not upload to Sketchfab, which is the site we’ve generally used for showing our 3D models. You can think of it as a YouTube or Flickr for 3D designs and scans!
We had some trouble finding an app that would convert these into a 3D file type that Sketchfab recognized. We eventually used Paint 3D to convert the files, but the textures did not come across; given time constraints, rather than trying to extract the textures, we used Paint 3D’s ability to upload the files it can open to Remix3D, another 3D model site. Unfortunately, Remix3D has a file size limit, so scans which captured a lot of texture could not be uploaded. Remix3D also has community standards that removed several of the altered books, including one which had featured the word “abortion,” making it not ideal for an educational environment that may produce projects about controversial topics.
The Sprout also scans in .obj files, which we work with more routinely in our 3D scanning, model creation, and 3D printing; we plan to conduct future Sprout scans as .obj files.
Using the Sprout Pro, students from Visiting Assistant Professor of Literature Andrew Rippeon’s “Unpacking My Library” course created 2D and 3D scans of “altered books,” book art objects the students had made, as part of the course, from books the library was discarding.
The hope is to use these scans as overlay images for the AR components of the book that the class will be writing, designing, and printing for their final project.
This is our first course use of the Sprout Pro and was a learning process for all. The Sprout Pro had some trouble tracking and 3D scanning the books, which were often too large or had moving parts. Even when we did get a scan of a significant portion of the geometry of the object, it would often lose tracking while taking photos for the texture; this resulted in the texture being placed on the wrong parts of the object. We briefly contemplated setting up the HP Structured Light Scanner in our Room of the Future, as that scanner may be a more appropriate tool for this project, but we didn’t have a good place to do so in the room—this is something our team will have to work out in the future.
We decided our time was best spent capturing 2D images of the books, to ensure we would have images of the students’ work to include in their AR book. The 2D scanning feature of the Sprout Pro G2 was intuitive, and students especially enjoyed the touch screen. We plan to attempt to 3D scan with the Sprout again during the next class session.
Here are some of the 2D scans which students took with the Sprout:
Our new laser cutter has arrived! Thanks to EDUCAUSE and HP for helping make this dream a reality. A laser cutter is a digital fabrication machine that can engrave or cut with precision into a wide range of materials like plastics, wood, leather, fabric, and paper. This enables a new range of possibilities for faculty learning goals and student projects.
This laser cutter will enhance the work in Visiting Assistant Professor of Literature Andrew Rippeon’s letterpress studio, allowing the creation of new printing materials (which we plan to incorporate into augmented reality explorations). Laser cutting will also be useful for our Theatre and Art Departments in a wide range of applications including set design, puppetry, and photography.
Our laser cutter is Universal Laser Systems’ VLS 3.50. Universal is a highly regarded leader in the field, known for their easy-to-use, professional-grade lasers and their user software interface. Specifically, Universal’s material library with presets for specific materials and their 1-Touch Photo software for easily processing photographic images for engraving make them ideal for an educational learning environment. Our laser cutter’s bed is two feet by one foot, allowing us to easily use standard wood sizes. In consultation with Physical Plant and several interested faculty and staff about needs and safety, our laser cutter will be housed in the Theatre Department’s costume shop, pending a future space for 3D and digital fabrication technologies.
Students from Unpacking My Library met with Educational Technologist Kyle Burnham to get the ball rolling on creating a book with augmented reality components. The class is broken up into several different project groups, including an AR group researching and creating the augmented reality components, and a documentation group recording and photographing the process.
The students previously produced a broadside on a campus letterpress, with a glyph representing each student. The idea is to mix the revolutionary 15th-century letterpress technology with our modern-day media revolution by virtually augmenting the broadside, such that when viewed with a phone, each glyph displays information related to that student, such as photos taken by the documentation group.
Today, we digitally scanned the broadside and isolated each glyph to designate them as “trigger images” in Aurasma Studio. Trigger images are what the phone’s camera searches for; recognizing one triggers the augmentation to appear and overlay the real world. In future sessions, we will connect the trigger images to the overlay images, which can be 2D and 3D images and video.
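Conceptually, what we are building is just a mapping from each trigger image to the overlays that should appear when it is recognized. A small Python sketch of that data model (this is our own illustration of the catalog we are organizing, not Aurasma’s actual API, which we drive through its web Studio; all file names are hypothetical):

```python
# Illustrative catalog for the broadside project: each letterpress glyph
# (the trigger image) maps to the overlay media that should appear when
# a phone recognizes it. File names are invented for this example.

aura_catalog = {
    "glyph_student01.png": ["photo_student01.jpg", "bookmaking_clip01.mp4"],
    "glyph_student02.png": ["photo_student02.jpg"],
}

def overlays_for(trigger):
    """Return the overlay files registered for a recognized trigger image."""
    return aura_catalog.get(trigger, [])
```

Keeping the pairings in one catalog like this makes it easy to audit which glyphs still need overlays before the students assemble the final book.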
We’re pleased to announce an additional course will be joining this endeavor, aiming to understand and improve the use of 3D technologies in academia with EDUCAUSE and HP! Visiting Assistant Professor of Biology Jessica Fellmeth’s Anatomy course this Spring will be using the HTC VIVE as part of their lab work, adding to the two courses from this semester in helping us collect data on the best practices for bringing virtual reality (VR) into the classroom.
The connection with Fellmeth’s class was made by our Digital Media Intern and Pre-Med student Rylie Mainville ’18. While helping oversee our virtual reality system, Mainville was inspired to consider how VR might be of use to students studying anatomy. Learners can see organs within the body without having to dissect a cadaver, and these organs can be observed while simulating body functions and diseases. This creates a much more interactive, visual learning environment to aid and cement understanding.
This course will be using two VR applications: Organon and YOU by Sharecare. We will kick off the course by bringing the HTC VIVE to the lab for a show and tell, guiding the students with hands-on use of the VIVE and the two environments. The VIVE will be used weekly, to examine the different parts of human anatomy under study, by 20 students in four or five groups. We will generate weekly reports on VR usage for the class (frequency and times used for each group/individual), which we will give to Fellmeth to explore whether greater use of the VR applications improves understanding in class. There will also be video lessons, yet to be determined, where Fellmeth will use VR in conjunction with her lecture.
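Producing those weekly reports requires nothing more than per-session records of which group used the headset and for how long. A minimal Python sketch, assuming sessions are logged as (group, start, end) entries; the log format here is our own invention for illustration, not something the VR software emits:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical session log: (group, start, end). In practice these would
# come from our sign-up calendar rather than being hard-coded.
sessions = [
    ("Group A", datetime(2018, 2, 5, 13, 0), datetime(2018, 2, 5, 14, 0)),
    ("Group A", datetime(2018, 2, 7, 15, 0), datetime(2018, 2, 7, 15, 30)),
    ("Group B", datetime(2018, 2, 6, 10, 0), datetime(2018, 2, 6, 11, 30)),
]

def weekly_report(sessions):
    """Total VR hours and session counts per group."""
    report = defaultdict(lambda: {"hours": 0.0, "sessions": 0})
    for group, start, end in sessions:
        report[group]["hours"] += (end - start).total_seconds() / 3600
        report[group]["sessions"] += 1
    return dict(report)
```

A summary like this, handed to the instructor each week, is enough to compare each group’s VR time against their in-class understanding.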
Here are some screenshots of the VR applications in use:
The primary project team for this course is Scott Paul, Ben Salzman, and Rylie Mainville ’18.
The Research and Instructional Design (RID) team of Hamilton College’s Library and Information Technology Services (LITS) has acquired the use of a room on the third floor of Burke Library for our Campus of the Future technology! With windows on two sides, we have a gorgeous view of the quad and the entrance to the library.
We’ve started setting up the equipment—there’s just enough room for a small-scale VIVE experience! The Sprout Pro, zWorkstation with DreamColor Z27x Studio Display, and the 3D Structured Light Scanner Pro S3 will also call this room home. A cabinet houses miscellaneous supplies and other technologies, like our 3D printer filament. Our Instructional Technology Apprentice Judy Zhou ‘19 will be getting comfortable in this space as she learns about our new tech!
UPDATE: The Aurasma phone app is now known as HP Reveal.
The students in Andrew Rippeon’s “Unpacking My Library” course visited the library today to learn about Aurasma, an easy-to-use phone application for augmented reality. We’re hoping to use this app to link digital images, video, and 3D scans into the book the students are creating for the course. Of the various ways this could be possible, we think Aurasma might be the best for us: it’s accessible to anyone with a phone, free, and fairly simple.
Anyone will be able to follow the Aurasma account for the book, and by holding their phone above the book as they read, will see the images, video, and 3D objects appear on the page — and perhaps wander across it! A cool feature of Aurasma is that anyone could make their own augmentations on the same images and “remix” the book.
You can try it out with the test images we used in class, made by Educational Technologist Kyle Burnham in Illustrator. To see the images transform, download Aurasma (iTunes, Play Store), search for #unpackingmylibrary, then tap and “follow” the two images below. If no “auras” appear after searching, or you can’t find where to search, Aurasma has documentation that can help you find and follow auras. You can also follow Kyle’s Aurasma account, kburnham3, and follow their Public Auras channel.
You can get hands-on and augment these images yourself, by taking a photo of them within the Aurasma app, or uploading them to Aurasma’s online studio.
The group of students working on the AR component of the book will meet again with Kyle in a few weeks to bring their ideas for this to life!
We’ve finished with the last of five sessions exploring a variety of 3D technology with Economics of Technology and Innovation! The students, most of whom had never experienced VR, AR, 3D scanning, or 3D printing before, clearly enjoyed the sessions.
To get feedback, Research Librarian for Teaching and Learning Initiatives Alex Rihm and Educational Technologist Kyle Burnham created the following survey, which we handed out at the end of each session:
How did experiencing the technology(ies) change your understanding? What did you gain from experiencing the technology that you would not have grasped otherwise?
Do you foresee using this knowledge/understanding in the future, either in classes or professionally? If so, how?
What could be done to improve this process/experience?
The feedback was positive, with students expecting they might use these technologies in the future, and feeling a greater familiarity with them and their applications. One student wrote, “It definitely helped me grasp how technology is rapidly bettering & could affect economies,” and another, “I can now talk about 3D printing and VR with more experience”. Universally, the students expressed that the realism of the experience in the VIVE was unexpected.
Professor of Economics Chris Georges was also pleased with these sessions—so much so that he scheduled the 10-student Posse he mentors to come in next Sunday and get the same session! This time we’ll have two student interns and rotate two groups of five participants through the technology.
When so many students need to use a VIVE headset—a limited, one-at-a-time resource—how do we schedule successfully? For the two courses we are supporting this Fall semester that will be utilizing our headset, we’ve recognized that the VR experience will have to happen outside of scheduled class time.
For Economics of Tech and Innovation, we’re using a Doodle poll, making use of features like limiting selection to one poll option, capping participants (in this case, at five), and ad-hoc recurrence for dates and times.
One of the goals of Economics of Tech and Innovation is using this hands-on experience to think about these disruptive fields. Given that, we wanted to supplement our paraprofessional student interns with professional staff to answer questions not just about the specific technologies, but the field more broadly. We thought we could also help our student interns manage four or five students exploring four different technologies, providing a mentorship opportunity between our professional and student support staff. Due to the class and sport schedules of Chris Georges’ students, the times that worked for them were all evenings; of these, we worked around Kyle and Ben’s schedules and events in the library to plan five sessions, enough for all the students.
For Contemporary Issues in Computer Science, we’ll be scheduling with Google Calendar, and it will be completely handled by our student interns. Our entire student staff will be trained in supporting VR, so Hirshfield’s students will be able to schedule a one-on-one appointment with the VIVE any time Burke Library’s Research & Design Studio is open.
Structured light scanning uses projected light and a camera to measure and record high definition 3D images. From HP’s DAVID Vision Systems, this professional 3D scanner, when paired with HP’s Automatic Turntable and HP Scan 5 software, “creates precise 360° 3D models of even the most complex items.”
A multimedia workstation, HP describes the Sprout Pro G2 as enabling us to “Manipulate the physical and digital worlds in innovative ways with immersive technology that’s built with a PC, hi-res cameras, Touch Mat and 2D and 3D scanning capabilities.” We’re excited to begin incorporating this into our educational technology support.
The HTC VIVE is a virtual reality headset for a fully immersive environment. From climbing Mount Everest to interacting with internal organs, the VIVE transports learners and brings new or enhanced experiences to the curriculum.
We’ll be exploring Best Practices for bringing the VIVE into the classroom with three courses for which experiencing the VIVE and its applications will be homework or lab time, and using the VIVE for Visiting Assistant Professor of Literature Nhora Serrano’s “Dream a Little Dream” course in the Spring ‘18 semester.
According to HP’s website, this professional studio display allows us to “Work in brilliant, trusted color and bring your ideas to life with the HP DreamColor Z27x Studio Display, featuring HP’s unrivaled integrated calibration engine, 4K input support, and 10-bit color that drives up to 1.07 billion onscreen colors.” We’ve hooked up our DreamColor Z27x display to our zWorkstation Z640.
Our intern Judy Zhou ‘19 has already been using this setup for her design projects, as will students in Visiting Assistant Professor of Literature Nhora Serrano’s “Dream a Little Dream” course in the Spring ‘18 semester—an aptly named course for this vibrant display!
An impressive tower computer workstation with a powerful graphics card, the zWorkstation Z640 is capable of running multiple VR headsets. It’s known for its whisper-quiet performance.
We anticipate using this for Visiting Assistant Professor of Literature Nhora Serrano’s “Dream a Little Dream” course, when her students create virtual reality environments based off virtual worlds in literature. Our Instructional Technology intern Judy Zhou ’19 will also be working with this workstation.
Students in Andrew Rippeon’s “Unpacking My Library: The Book, The Burke, and the 20th Century” (Literature & Creative Writing) are introduced to the history and practice of the book in a long arc from the pre-Gutenberg era into the present. With a focus on the 20th century, Rippeon’s students consider “the book” and “the library” as literary, theoretical, and material engagements: what does it mean to curate a library? How do technological developments bear upon information? How do authors and artists respond to these questions? Over the semester, and in addition to reading in these contexts and to writing their own original critical essays, students make broadsides and books, curate micro-libraries, and produce (as a hard-copy book) an anthology of their writing.
In this iteration of the course, enabled by the EDUCAUSE/HP initiative, students will create their own charged technological context for the book: how does an augmented-reality book further pressurize the context we’re discussing? Students will use 3D technologies (3D printing and the Sprout Pro learning station) and augmented reality applications to produce a book that has a much broader material-technological footprint, at once engaging with and commenting upon the status of the book in the 21st century. We intend to produce an augmented-reality book that documents its own context and production.
Continuing their engagement with “the book” in a technologically mediated environment (at once threatening and enabling), with the assistance of partners in Digital Humanities, Computer Science, and Research & Instructional Design, students will develop and build an Augmented Reality “Book of Sand.” Inspired by the Borges story of the same title and the Argentinian artist Mariano Sardón’s treatment of that text, students in the class will construct a physical sand-table, and with the assistance of senior Computer Science thesis groups, participate in coding a digital library for “use” on the sand-table. The end-goal is to produce a digital-material interface for the purposes of provoking interactions with (rather than passive reception of) individual texts, genetic iterations of a single text, and larger digital libraries.
Students will be assessed on the conceptualization and mechanics of their writing, as they would be in any other literature or writing-intensive class, with the additional element that they will also be asked to engage with the conceptual and technological pressures brought by technology to the book itself. Students will be asked to reflect in writing upon the past, present, and futures of the book, as enabled, enacted, and destabilized by these technological developments. Students will work in groups over the course of the semester as they execute the various material and conceptual tasks involved in the production of the class book. Evaluation of our success in creating the class book will be based upon our integration and mastery of the technological elements involved: AR platforms; incorporation of images, film, animation, and sound in the documentation; 3D scanning and printing; and laser-cutting. Success on the classroom side of the equation will require student reflection upon and critical engagement with the AR-enabled book, and with how these platforms bear upon the trajectory of the book from its origins to the present.
Dream a Little Dream: Virtual Realities and Literature, taught by Nhora Serrano (Literature & Creative Writing) will introduce students to the representation of virtual worlds in literature, and how these ‘dreamscapes’ have transformed our understanding and experience of the “real.” Rooted in Borges and Cortázar and featuring authors such as L. Frank Baum, Lewis Carroll, and Murakami, students will use 3D technologies to imagine, design, and create virtual objects and environments based on the course readings. We anticipate a library research component to be embedded within this project.
As a Medieval specialist, Serrano is interested in exploring how virtual worlds are not a new concept, though the ways we explore them now are different. Serrano teaches about mimesis/representation and visuality in courses that engage the artificial and the real on different levels, from storyboarding to 3D technologies like VR and 3D printing. To enhance their analytical skills, Serrano has her students engage with visual elements through creating; in this case, by building a virtual environment, students can better learn about the significance of virtual and “dream” worlds in the texts they read, recreating those worlds to gain a better sense of what their virtuality represents thematically within the text.
The main assignment for the course will be to create a virtual moment from a book read in class (in groups of four or five students). Students will learn Unity to create virtual environments that will be explorable with the HTC Vive connected to the zWorkstation and DreamColor display. Students will research and use found 3D models, and scan their own 3D models for use in this environment using the Sprout Pro and the HP Structured Light Scanner. As a scaffolded individual assignment in service of the final project, each student will select an object from a book read in class, then locate and 3D scan a real object that represents it (using the Sprout Pro or Structured Light Scanner). Students will have to justify both the choice of object from the book and the choice of the real object used to represent it virtually.
Students will develop familiarity with, and interest in, digital techniques for creative analysis and future use;
Students will use simple 3D scanning to gain technological skills;
Students will build team-working skills by creating a virtual environment in groups;
Students will connect ideas about virtual/dream worlds from the text to our real(?) world;
Students will evaluate various levels of representation and “the real” through texts, the worlds within texts, and objects and virtuality in our own world.
Faculty: Nhora Serrano
Instructional Designer and 3D Instructor: Ben Salzman
with support from Scott Paul and Digital Media Interns
The GeoSciences 3D Scanning Project will offer a virtual in-depth perspective of the College’s GeoSciences Mineral collection for teaching and research. The collection, which contains samples acquired by faculty, students and alumni over the past 200 years, is utilized by neighboring schools across Central New York.
Hamilton College’s Instructional Technology Apprentice and a student majoring in GeoSciences will be given an opportunity to embark on an experiential learning project of archiving and digitizing the College’s mineral collection. This will involve both students learning the art of 3D scanning, along with standard techniques for cataloging and organizing metadata schemes.
The HP 3D Structured Light Scanner Pro S3 will be utilized to scan the minerals from the collection and the zWorkstation will provide the processing power and help us archive the collection.
Students will digitize the mineral collection to make it available to the online community.
Students will learn the process of 3D scanning objects of various sizes, shapes, textures, and composition.
Students will work with basic metadata formatting and develop database management skills.
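The project description doesn’t specify which metadata scheme the students will use; as a hypothetical sketch, assuming a Dublin Core-style set of fields, a catalog record for one scanned mineral might look like this (all field values below are invented examples, not actual collection data):

```python
from dataclasses import dataclass, asdict

# Hypothetical catalog record for one scanned mineral, loosely modeled on
# Dublin Core fields; the actual scheme adopted by the project may differ.
@dataclass
class MineralRecord:
    identifier: str   # catalog number assigned during digitization
    title: str        # mineral name
    description: str  # free-text notes (color, habit, locality, etc.)
    creator: str      # who produced the 3D scan
    date: str         # scan date, ISO 8601
    format: str       # file format of the 3D model
    source: str       # provenance within the collection

record = MineralRecord(
    identifier="HC-MIN-0001",
    title="Quartz (example)",
    description="Example entry only; not an actual specimen record.",
    creator="Instructional Technology Apprentice",
    date="2018-02-01",
    format="model/obj",
    source="Hamilton College GeoSciences Mineral Collection",
)

# asdict() flattens the record for export to a database or CSV catalog.
print(asdict(record))
```

Keeping each record as structured data like this makes it straightforward to export the whole collection to whatever database or online platform ultimately hosts it.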
Team Faculty: Dave Bailey
Instructional and 3D Technology Designer: Ben Salzman
with undergraduate Instructional Technology Apprentice Judy Zhou
During the Fall 2017 Semester, two separate courses at Hamilton (Contemporary Issues in Computer Science, taught by Stu Hirschfield, and Economics of Technology and Innovation, taught by Chris Georges) will be exploring how to successfully supplement classroom study with an experience of the technology under discussion. The challenge for the Research & Instructional Design team is providing instruction to many students in a limited amount of class time. In total, 70+ students will experience 3D technologies as related to their subject of study. This will be accomplished through two modes: 1) one course (50 students in Computer Science) will focus on using VR and will work with our team’s course support student interns to access the HTC Vive in one-on-one appointments; 2) the other course (20 students in Economics) will explore 3D technology more broadly (VR, AR, 3D scanning and printing) in several small groups with a professional staff member and student interns on hand. The Instructional Support Leads hope to determine best practices in bringing VR and other 3D experiences to groups, and to evaluate the individual vs. the group experience (including single-tech vs. multi-tech experiences).
We will gather assessment information via a student feedback survey and through a debrief with the faculty members, professional staff, and student interns.
Goals and objectives:
Research team members will determine best practices in bringing VR and other 3D experiences to a classroom setting;
Research team members will evaluate the individual vs. the group experience (including single-tech vs. multi-tech experiences).
NEW: In the Spring 2018 Semester, we will be using the VIVE for Visiting Assistant Professor of Biology Jessica Fellmeth’s Anatomy lab course, enabling students to see and interact with the internal organs they’re studying.
Team Faculty: Chris Georges, Stu Hirschfield, Jessica Fellmeth
Instructional Support Leads: Alex Rihm, Ben Salzman, Kyle Burnham, and Scott Paul
with undergraduate instructional support interns
Sam Pellman (Music) will research and experiment with new methods of utilizing OSC commands for live augmented reality musical performance in his project, Exploring Sound Design and Musical Composition with Augmented Reality and Virtual Reality. Spatial audio and ambisonics for VR and AR environments provide incredible opportunities for the live performance of electroacoustic music. Additionally, 360 video in conjunction with ambisonic microphones offers unique methods to record classical performances in immersive ways. This project will also result in the creation of multimedia experiences, including works for VR and the reprocessing of a piece, Selected Cosmos: Sounds of Life, which is a sonification of the base pair sequence of human DNA (as reported by the Human Genome Project).
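OSC (Open Sound Control) messages like those mentioned above use a simple binary wire format. As a minimal illustration (a sketch of the OSC 1.0 message layout, not Pellman’s actual Kyma setup), the following encodes an OSC message carrying one float argument using only the Python standard library:

```python
import struct

def osc_string(s: str) -> bytes:
    """Null-terminate an OSC string and pad its length to a multiple of 4."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message with a single big-endian float32 argument
    (type tag string ',f'), per the OSC 1.0 specification."""
    return osc_string(address) + osc_string(",f") + struct.pack(">f", value)

# Hypothetical address '/volume'; a real patch defines its own address space.
packet = osc_message("/volume", 0.5)
```

The resulting packet could then be sent over UDP (e.g. with `socket.sendto`) to any synthesis engine that listens for OSC.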
The HTC Vive and zWorkstation will be utilized to run and create all of the VR and AR elements and environments. The Sprout Pro G2 will be used to experiment with scanning and the manipulation of scanned objects for the multimedia pieces. The HP DreamColor Studio Display will be utilized for all image mastering and color correcting. Of note: because of the powerful graphics card in the Z640, we will utilize it for all the video rendering for Selected Cosmos: Sounds of Life.
This research project will include three components. First, building on the current production and performance of an AR-enhanced composition (performed at the annual Kyma International Sound Symposium in Oslo, Norway on October 14, 2017), Sam Pellman and Ben Salzman (3D Designer) will continue to research methods and techniques for using Kyma and the VIVE headset. This research will include how to use real-time audio production in Kyma that is synced with and controlled by the game development engine Unity.
The second component of the project will involve re-rendering the video for a work, Selected Cosmos: Sounds of Life, a sonification of the base pair sequence of human DNA (as reported by the Human Genome Project). The zWorkstation Z640 will allow Pellman to properly process the large amount of visuals that were created through a Max/MSP Jitter patch. These visuals are meant to accompany the existing audio element.
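Pellman’s actual mapping from genome data to sound isn’t described here; as a purely illustrative sketch of the sonification idea, one simple approach assigns each DNA base its own pitch (the frequencies below are an arbitrary choice, not the piece’s real mapping):

```python
# Hypothetical base-to-frequency mapping in Hz (roughly C4, D4, E4, G4);
# the mapping used in Selected Cosmos: Sounds of Life is not documented here.
BASE_FREQS = {"A": 261.63, "C": 293.66, "G": 329.63, "T": 392.00}

def sonify(sequence: str) -> list[float]:
    """Turn a DNA base sequence into a list of note frequencies,
    skipping any characters that are not A, C, G, or T."""
    return [BASE_FREQS[b] for b in sequence.upper() if b in BASE_FREQS]

notes = sonify("ACGT")  # one frequency per base
```

The resulting frequency list could then drive any synthesis backend, one note per base.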
The third component of this research project, scheduled to take place in the spring 2018 semester, will explore how prerecorded versions of concert music can be rendered in a VR environment. This will possibly utilize 360 video and an ambisonic microphone to record faculty performances. Pellman will be identifying appropriate faculty members within the department. The performances will be posted online. This portion of the project will rely heavily on utilizing the Z640 workstation in conjunction with the VIVE headset.
Research team members will learn and develop techniques for live VR & AR sound design.
Research team members will create new multimedia works for this medium utilizing HP equipment.
Research team members will experiment with the capabilities and range of processing VR with the Z640 Workstation.
Research team members will explore how VR, spatial audio, and ambisonic microphones can be used to record classical concerts.
Faculty: Sam Pellman
3D Technology Designer: Ben Salzman