The Space – TimefireVR

Office Space of TimefireVR

We are almost ready to tackle our objective of building some kick-ass Virtual Reality, but there is the hurdle of unknown unknowns, some of which have been becoming known as time passes. There is much to do in order to build something as audacious as virtual reality. Knowledge is probably the biggest factor in proceeding with this endeavor, and we have to constantly check ourselves to verify that we have enough.

First, we have the gamut of hardware requirements. The computers we built ourselves are based on Intel's i7-4930K, 16GB of RAM (except one PC that needed 64GB – it belongs to the UE4 expert among us), EVGA GTX 780 Ti graphics cards, and some extraordinarily large 27″ Asus monitors that each let us see a glorious 2560×1440 resolution. We built them on open bench cases, so we are constantly reminded of the horsepower at our fingertips.

That's not the only hardware we need. There is the matter of the Oculus Rift virtual reality headset, through which we are allowed to peek into the immersive world of the virtual. Not only do we have a DK1 (Development Kit ver. 1) that I acquired during their now-famous Kickstarter, but I also nabbed a prototype Oculus HD while at GDC back in March. Sitting on another desk are a test unit of Microsoft's Kinect 2, a Razer Hydra, a game controller, and some headphones that will allow us to hear sounds in the faraway corners of the VR world we are building.

For software, we are exercising the low-budget muscles. Foremost among these tools is Epic's Unreal Engine 4 (UE4). It was at the Game Developers Conference (GDC) in San Francisco that Epic crushed the dreams of other game engine developers while inflating ours with a price of only $19 a month per seat instead of the stratospheric prices of yore, prices so exorbitant they couldn't even be spoken. Next up in our arsenal is the open-source and FREE 3D software known as Blender. Without Blender, we would be facing tens of thousands of dollars for our mesh creation and animation tools. Instead, we have saved a substantial amount of money that is being better spent on payroll. Equally important to our workflow is Allegorithmic's suite of tools: Substance Designer and Substance Painter. While we are presently on hold for lack of easy-to-use native support or a friendly plugin that ties together UE4 and Substance, we are patient; in any case, we have so much mesh modeling ahead of us, along with filling those knowledge gaps I spoke of, that we can wait. Rounding out our toolkit are 3D-Coat, MakeHuman, and TerreSculptor.

Right now, we are only three people, but we have the ability to hire others. We have to hold off on bringing in anyone else, as there's a lot for us to do now that we are formally a company and not a hobby. One guy is digging hard into UE4 to better understand Materials, Blueprints, splines, and plugins, compiling our own versions, and keeping up with the crazy openness of Epic and their new philosophy of sharing everything. Brinn, being relatively new to mesh modeling and Blender, is spending 80 hours a week mastering this half of his responsibility to our new company. His other task will be our soundtrack and audio work. Me, well, I'm knee-deep in business affairs and working through the kinks of our funding, though I have to say that the financial group we have behind us is amazing and has surprised me with the speed with which they stepped up to make it all happen. Thanks, guys, and especially Jeffrey Rassas, who led these efforts.

Employee #0 – The Founder – TimefireVR

Caroline Wise and John Wise in Denmark

The guy on the right, that's me, John Wise, the founder of Timefire. The woman with me is my wife, Caroline. Not only were we in Germany on the Danish border in part to celebrate my 50th birthday, but we were also only a month away from returning home and taking possession of an Oculus Rift. So, in a sense, this is the last time I had a "normal" photo taken of me, as life is now measured in B.O. and A.O. (not Body Odor and Ambient Occlusion, but Before Oculus and After Oculus). I don't think anyone with an imagination who has looked into the Oculus headset has come away from it untouched. For me, it was apparent that Virtual Reality was finally on track to do something big, but what I couldn't know last May was just how much.

Point Cloud image for Timefirevr

Almost a year ago, a friend and I started working on some experiments to see what the things we built would look like in VR; we weren't disappointed. While the first Oculus Rift headset had ginormous pixels and enough motion blur to make the most stoic queasy in minutes, it was easy to see that Palmer Luckey was onto something. So we started learning all we could about what might be entailed in building a Virtual Reality environment. We embarked on learning about low-poly and high-poly meshes. UV unwrapping was easy compared to our struggles with lightmaps, which seemed like evil magic that only wizards could master. Being noobs, we would have to learn all we could about the differences among 256, 512, 1024, 2K, and 4K textures and how these work together in a texture atlas. Level of Detail (LOD) is an important factor for game devs, and Breaking Bad didn't teach us anything about cooking up a good batch of LOD. How about terrain and plants? Well, how many ways do you want to learn to make either? We'll also have to take a look at motion capture, animation, rigging, character development, lighting, weather, and a host of other functions before we start feeling competent about authoring a game.
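Some of those texture lessons lend themselves to a quick illustration. The sketch below is hypothetical – the function name and the 2×2 layout are my own assumptions, not any particular engine's API – but it shows the basic math of a texture atlas: several small textures share one image, so each mesh's 0–1 UVs get squeezed into an assigned tile.

```python
# Hypothetical sketch of texture-atlas UV remapping; names and the
# tile layout are illustrative, not from any specific engine.

def atlas_uv(u, v, tile_col, tile_row, tiles_per_side):
    """Map a per-texture UV (0..1) into one tile's sub-rectangle of an atlas.

    With a 2x2 atlas (tiles_per_side=2), tile (1, 0) occupies
    u in [0.5, 1.0) and v in [0.0, 0.5).
    """
    scale = 1.0 / tiles_per_side
    return (tile_col * scale + u * scale,
            tile_row * scale + v * scale)

# A 1024px atlas split 2x2 holds four 512px textures; the center of
# tile (1, 0) lands at (0.75, 0.25) in atlas space.
print(atlas_uv(0.5, 0.5, 1, 0, 2))  # (0.75, 0.25)
```

The payoff is fewer texture binds and draw calls, which is exactly why atlasing keeps coming up in game-dev material.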

Just as we think we're getting somewhere, events conspire to deliver a huge setback while simultaneously offering a springboard into the future. Seeing as much of what we were playing with was "practice," when the game industry started making the transitional moves we were witnessing, we didn't have a lot to scrap. First up was ditching bitmap textures and learning to dance with the new kids on the block – procedural textures and Physically Based Rendering (PBR). In December, I took a road trip to Hollywood for a first meeting with the great minds of Allegorithmic. That night, I was introduced to the alpha version of Substance Painter. The writing was on the wall: game developers were going through a monumental shift. Little did I know that this was just the tip of the iceberg.

Canal View of VR city in TimefireVR

After learning from Oculus that Valve Corporation was hosting Steam Dev Days the following month in Seattle, Washington, I took a shot and wrote Gabe Newell to inquire about an attendance code that would allow me to register for the event. Within two days, I was in, and tickets and hotel reservations were made. Following Gabe's keynote and an unsuccessful attempt to ask him a question, I followed the founder of Valve out to the lobby, where I had the opportunity to talk with him for a few minutes. This was my lucky 50th birthday year; Gabe took me over to the "Booth" – the VIP room where a small handful of lucky industry people would get to witness the work Valve had been doing on their own VR headset. Standing there, I was introduced to Michael Abrash, Tom Forsyth, and Atman Binstock. It was Atman who led me through the demo. OMG, this VR is amazing, tear-inducingly overwhelming.

Then, in March, another shoe dropped in the cascade of events that convinced me it was time to take action: Epic released Unreal Engine 4 at the incredibly low price of only $19 a month, and it was available to everyone immediately. Oh, that and Oculus showed Development Kit 2, which was utilizing some of the magic that Valve had been sharing with them. In retrospect, it is obvious why Mark Zuckerberg would be so blown away by the technology that he would have to buy Oculus. It was also that pivotal moment following GDC in March that had me recognizing I would have to transition from a hobby project to a serious company armed with the right tools and staffing to deliver a compelling story and environment.

And so Timefire was born.

My previous work includes active participation in the German beginnings of Techno music and bringing the first 3D computer graphics to the European continent. Along the way, my interest in communication and video arts saw me working with Nirvana (1989), The Sugarcubes (Bjork 1988 and 1989), Henry Rollins, Psychic TV, The Pixies, Nitzer Ebb, and many others. Upon my return to the United States after a 10-year extended sojourn, I opened one of the first 25 Internet Cafes on Earth in Scottsdale, Arizona, before moving between various tech companies, culminating with my initiative to build the world's first sub-quarter-million-dollar clustered supercomputer with Jeffrey Rassas – a current partner and the guy who has led the fundraising that is allowing us to compete in building what we hope will be a viable and interesting VR platform. My current interest in VR was originally born over 20 years ago, when expense and poor quality kept anyone from making a serious attempt at bringing the technology to the masses. Today, those limitations have been removed, and the world is about to undergo a fundamental and profound shift as we are thrust into the FUTURE.

A Start – TimefireVR

TimefireVR Team

Four days ago, an old friend of mine, Jeffrey Rassas, showed me some office space he had available almost exactly across the street from where he and I had worked together nearly 15 years earlier. It’s only been about two weeks since he and I started discussing a business opportunity that would see him getting seriously involved with my new project – building a Virtual Reality environment.

Within hours of seeing the space, I grabbed two guys who were the most likely to join me in building a virtual world: redacted (center) and Brinn Aaron (left); that’s me, John Wise, on the right. We got together and took a drive to Ikea to order the pieces we’d need for our desks, and then we headed over to my place to order the computers. The desks arrived on the 24th, and the computer parts on the 25th. Now, here on April 26th, we are ready to start work. Timefire LLC is born.

Going to GDC – TimefireVR

GDC Signage (GDC Online 2012)

Soon, I will leave for my first Game Developers Conference, better known simply as GDC. It's held annually in San Francisco, California, which I'll be road-tripping my way up to on a 1,532-mile (2,466 km) adventure. I'm writing this blog entry as a two-part story to get some of the planning details, expectations, and sense of excitement leading into this out of the way. That way, my next post will focus on the event itself.

GDC is the world's largest and longest-running professionals-only game industry event; their description, not mine. How do they know it's pros only? Probably because of the expense to attend; this isn't cheap. Exposition floor tickets alone are $195 each, while a full pass will cost you between $1,475 and $2,100, depending on when you register. Add transportation, food, and hotels, and soon a small developer will approach $1,000 in costs for even the minimal attendance option – per person!
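As a rough illustration of that arithmetic: only the $195 expo pass is a figure from this post; the hotel, fuel, and food numbers below are placeholder assumptions for a drive-up attendee.

```python
# Back-of-envelope for a minimal GDC trip. Only the $195 expo pass is
# from the post; every other figure is a placeholder assumption.
expo_pass = 195
nights, hotel_per_night = 4, 150   # assumed SF hotel rate
fuel_round_trip = 120              # assumed for a ~1,500-mile drive
food = 4 * 40                      # assumed per-day food budget

total = expo_pass + nights * hotel_per_night + fuel_round_trip + food
print(total)  # 1075 -- roughly the "approach $1,000" floor, per person
```

Swap in your own travel numbers; the point is that the ticket is the smallest line item.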

Last summer, I started considering a trip to GDC. Earlier in the year, it had been impossible for us not to see the announcements coming from the conference regarding Oculus and the "Infiltrator" demo showcasing Epic's Unreal Engine 4 (UE4). So I went over to the GDC site, but prices and options to purchase were not posted yet. Then, in September or October, the information went live, and for a brief moment, I had the opportunity to buy admission to the Independent Games Summit (IGS) for only $695. I also read on the internet that first-time attendees shouldn't worry about the full conference and summits, that just visiting was enough, even overwhelming. As I progressed in my own development on my VR project, I finally returned to the registration page. The IGS passes were sold out, which settled which part of the event I'd be attending. The lesson here is to plan well in advance, better than six months out, if you are thinking of going. Read everything you can about other people's experiences, and start saving money.

So, what's the attraction for someone just getting started in the game industry? This is our Super Bowl, our World Cup; it is the Olympics and the Oscars all rolled into one giant geek fest celebrating the developers who are changing how we see and interact with our world. Gaming is not just some teen sitting in isolation killing zombies; it is an evolving phenomenon that alters humanity's relationship to education, entertainment, social interactions, medicine, work, war, and soon how we travel. The companies presenting workshops or occupying booth space often hold their major product announcements until GDC because they know the world is listening during this event. It is in large part these announcements and demonstrations that played a role in my decision to attend.

The first and foremost among those hoped-for announcements will be from Epic or maybe Oculus. When the doors open, I suppose I'll have to flip a coin to decide whose booth I'll bolt to first. Both companies are likely to make announcements so major that, in retrospect, this will be one of those defining moments in history: the week we as a society look back to and recognize as the time we first learned the world was changing in such a dramatic fashion.

For about ten years, Epic has been working on its next-generation game authoring engine. There is no guarantee that it is ready to make a public appearance, but the signs that Epic may be about to unleash this super-charged tool are many. Over the past seven months, Epic has rolled out five videos that have allowed us to see the software and have put its user interface on display. Their booth space, in comparison to the previous year's, is huge. At CES (the Consumer Electronics Show) in Vegas, they allowed Nvidia to run a UE4 demo and prepared a special presentation for the guys at Oculus to showcase their Crystal Cove prototype, which Valve was also enthusiastically supporting. With Sony and Avegant breathing down Oculus's neck in the race to reach the consumer market with VR, it would appear to me that it is in Epic's own interest, as an early supporter of the technology, to help push out what could be the premier VR authoring environment.

Then we have Oculus themselves, who have an amazing amount of floor space reserved at GDC – three separate booths, as a matter of fact, on two floors. Previous statements said they would be ready with a consumer model once they figured out positional tracking; well, that's exactly what they were showing in early January at CES. Another 60 days have passed, and I'm certain they have not slowed down on refining how their technology works. If they were to announce Dev Kit 2 (DK2) at GDC, my guess is they would give themselves enough lead time to prepare the units, plus a bit of padding in case certain new elements were about to find their way into a DK2, so that they could tempt us with an announcement that new developer units will ship within a relatively short time after GDC. They are certainly not going to GDC with 3,000 square feet of space simply to show us what we have all already seen. I expect something HUGE!

Next up in importance is yet another encounter with Allegorithmic. Just this past week, they released to the general public a beta version of Substance Painter on the Steam Early Access program operated by Valve. Last month, they sent out invites to those who might be interested in attending a Substance User Group meeting to be held during GDC; I wasted no time RSVPing, which in turn is forcing me to leave a day early. I am looking forward to learning about the plans for Painter going forward and what they have in mind for improving Substance Designer.

Marmoset will be present, but Toolbag 2.0 was recently released, so I think they'll be there mostly to meet with their user base, get feedback, and let some lucky few know what they have in mind for future versions. Speaking of software on the horizon, Quixel has a booth, and there is no way they are there only to show the current version of nDo2, which is having problems operating with Photoshop CC for a number of users. I'm pretty certain we'll see new versions of nDo and dDo, and their new Megascans service.

These are just the major areas of interest drawing me to GDC. There are also Simplygon, Nvidia, Valve, SpeedTree, Perforce, the Belgian Trade Commission, and dozens of other vendors yet to be discovered, all of which have me thrilled at the prospect of being on hand for this amazing conference.

Then there are the parties.

Tuesday, after our user group meeting, I’ll head down the street to the NativeX Party. While I’m not a Corona Labs user, I am interested in learning all I can about mobile gaming, and with NativeX cosponsoring the party, I might learn something more about mobile ads. Hey, I’m new to GDC and can’t turn away from any opportunity to learn and be entertained.

Come Wednesday night, I have two parties to attend. First up, I'll head to the Novela Bar for a party sponsored by Kontagent+PlayHaven and VentureBeat. I'm starting to see a theme here: after we've hung out all day with the developers of the tools we work with, the guys who handle the all-important "after-the-game-has-been-released" job are there to talk to us. From there, I'll walk over to AT&T Park – home of the San Francisco Giants – for a giant party hosted by YetiZen. A couple of live acts will be performing, tournament video gaming will run all night, and people on the VIP list will be able to test drive a new Ferrari.

Thursday following GDC, an event is taking place at Swissexsf titled “Spatial Storytelling: Augmented and Virtual Realities,” where we are promised an evening of immersive games and installations incorporating virtual and augmented reality. Some of the participants are Disney, Stanford Virtual Human Interaction Lab, OuterBody Labs, Apelab, and BandFuse. Sadly, the festivities are planned to stop at 10:00 p.m.

Finally, on Friday, the last day of the conference, there's a party hosted by 8bitSF and Pow Pow Pow called Band Saga LIVE. It will be over at the DNA Lounge with a bunch of bands performing, including Metroid Metal and An0va. I'm still waiting to see if Kiip responds to my RSVP to attend their party on Wednesday night. That will be tight, as I'm sure a "things wrapped in bacon" party is going to be a popular one. And it seems the most difficult party to get into is Notch's .party( ). Last year, the inventor of Minecraft had Skrillex play his party; this year, Nero, Kill The Noise, and Feed Me are scheduled to perform. The event seems to have filled up minutes after it was made available. There's always next year.

The Challenge of Building Virtual Reality – TimefireVR

Serafini Mesh for TimefireVR

Creating content for virtual reality when your aspirations are set high can be a daunting task. To move beyond the low-poly, small-bitmap-textured environments that have come to typify video game art for the past couple of decades, a small indie team must focus on mastering a wide range of tools. The principles of this creativity are the same as for Triple-A titles, but instead of 300 people creating the world, you might be only one or two.

Since this past summer, a lot has changed as the game industry undergoes an amazing evolutionary convulsion. What stands out is that hardware, software, and competition from the indies are at a critical stage of altering the business model and show no sign of letting up. The convergence of GPU rendering power, high-quality mobile display tech (the backbone of the VR headset), procedural and physically based textures, the Steam network and various other platforms for content distribution, and myriad advancements in general PC hardware, such as inexpensive RAM and fast SSDs that let us run Blender, Unreal, Substance Designer, Photoshop, and DayZ all simultaneously, is giving the individual an incredible amount of leverage.

With this power and opportunity, I'm facing steep learning curves every single day. At first, I struggled with simply learning how to make seamless textures. Then the long, slow arc of learning UV unwrapping showed up like Godzilla to crush my idea that I could just throw these materials on my meshes. Because I was looking to bring these skills to the Unreal Development Kit (UDK), I focused only on low-poly meshes. I had no game-creating experience, and so, as I understood things, I needed to be super conservative with the geometry and the size of materials.

Then Epic started teasing more details regarding the much-anticipated Unreal Engine 4, and I started dreaming that it must be around the corner. Maybe I should build things with the idea that I'd migrate my work when Epic finally made the update from UDK to whatever they would end up calling the consumer version of UE4. Just as I was making this transition, I learned about Allegorithmic's Substance Designer and their database of over 700 procedural textures, and a new jumble of highly technical stuff was thrown at me. As quickly as I was adapting to this new paradigm of working with textures and materials, Allegorithmic added Physically Based Rendering (PBR) to the mix. I'd seen hints of this in one of the UE4 demos, so this just solidified the fact that it was going to be something important. Sure enough, here comes Marmoset with Toolbag 2.0 and a heavy emphasis on PBR, too. Better bone up on what this will bring to the workflow.
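For readers wondering what PBR actually changes: artists author physically meaningful parameters such as roughness instead of hand-painting highlights. As one small, hedged illustration, here is the textbook GGX microfacet distribution term that a roughness value typically feeds; this is generic shading math, not Allegorithmic's or Marmoset's actual code.

```python
import math

# A minimal sketch of one ingredient of physically based shading: the
# GGX (Trowbridge-Reitz) normal-distribution term. Textbook math only.

def d_ggx(n_dot_h, roughness):
    """GGX microfacet distribution; roughness and n_dot_h both in [0, 1]."""
    alpha = roughness * roughness          # a common artist-roughness remap
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A smoother surface concentrates the highlight: at n_dot_h = 1 the
# peak is far higher for roughness 0.2 than for roughness 0.8.
assert d_ggx(1.0, 0.2) > d_ggx(1.0, 0.8)
```

The appeal of this workflow is that the same roughness map looks plausible under any lighting, which is why tools across the board were converging on it.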

If that wasn't enough, David Green of LilChips continued to offer me updates to his still-alpha landscape-building software, TerreSculptor, so I had to investigate those changes, too. Then an email pops into my mailbox from Allegorithmic inviting me to Los Angeles for a sneak peek at Substance Painter; I can't resist. The first version is primitive, too primitive to work with, but great for stealing a glimpse into what those guys have in mind. Luckily, since I couldn't deep-dive into Painter yet, I held my focus on other tasks while the software matured, but I now know that 3D painting is certainly going to be another tool in helping me achieve the kind of results that were once exclusive to the Triple-A guys.

By late December, I was fully immersed in the process of building high-poly models, learning about retopology, and discovering that auto-unwrapping UVs was not going to cut it. Displacement maps as a modeling technique looked appealing, and from what I was seeing from Surface Mimic, the maps being used in Substance Painter performed great in Blender and in my sculpting tool of choice, 3D-Coat.
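The idea behind displacement mapping is simple enough to sketch: unlike a normal map, which only fakes lighting detail, a displacement map actually moves geometry. The helper below is a hypothetical minimal version of the core operation; real tools such as Blender or 3D-Coat do this per-texel with far more care.

```python
# Hypothetical sketch of what a displacement map does: each vertex is
# pushed along its (unit) normal by the height value sampled from the map.

def displace(position, normal, height, scale=1.0):
    """Offset a vertex along its unit normal by height * scale."""
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# A vertex at the origin with a straight-up normal and a sampled
# height of 0.5 ends up half a unit higher.
print(displace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5))  # (0.0, 0.0, 0.5)
```

This is also why displacement needs real geometry density to work with, and why it pairs naturally with the high-poly and retopology work described above.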

The problem was that I was not making much progress in building my world; all I was doing was learning, all the time. With the New Year came a date on the calendar I needed to tend to: Steam Dev Days, hosted by Valve. With a large portion of the Seattle-based conference focused on VR, I thought it might be worth the expense to learn more about this platform I was so fascinated with developing for. As I wrote in a previous article, Gabe Newell made it possible for me to get a peek into their VR headset, where I instantly knew that not only was I on the right path, but this was going to be more epic than I had imagined back when I first put on the Oculus Rift Dev Kit 1.

From Palmer Luckey, Joe Ludwig, Michael Abrash, and the rest of the people who were talking about VR during those days in the Pacific Northwest, it was made abundantly clear that large-scale environments weren't ready for prime time yet. I didn't want to acknowledge this, because my hope was that Nvidia would rectify poor frame rates with their new Maxwell line of GPUs, but slowly I've given in to the idea that I'll have to shelve part of the world for a time and find a different focus. Luckily, I found a compromise that temporarily pushes my earlier efforts to the side while I work on a "corner" that allows for a tighter focus on the intimate instead of the massive.
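The frame-rate concern is easy to quantify: a headset's refresh rate fixes a hard per-frame rendering budget, and a large-scale environment that blows that budget causes judder. A small sketch, where the 75 Hz figure matches the DK2's announced refresh rate and the rest is illustrative:

```python
# Why large-scale VR scenes were hard in 2014: the headset's refresh
# rate fixes the time available to render each frame. The DK2 targeted
# 75 Hz; missing that budget causes judder and, quickly, nausea.

def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame (both eyes)."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 90):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

At 75 Hz that is barely 13 ms to draw the whole stereo scene, which explains the retreat from the massive to an intimate "corner."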

Riding on the elation of what I'd seen in Seattle, I thought it was time to write an email – to Epic. Within 24 hours of that missive, I learned that I would be in the next cycle of new devs being brought on. I don't think I could have been any more dumbfounded than I was at that moment. Within about a week, I was downloading my own personalized (for security purposes?) version of UE4 (not its code name, but I'm not authorized to speak 'its' name :).

It has now been a couple of weeks since my head was spun around on its axis, and things are starting to normalize, if that is really possible. I'm again making progress in the creative department, but weaving all of this together is a gargantuan task. In three weeks, I'll leave for San Francisco to make my first visit to GDC (the Game Developers Conference), where I'll likely once again be overwhelmed by the rapid evolution occurring in this industry. None of this is a complaint; on the contrary, I'm too astonished right now to see any of this work as a burden. I stand humbled by the gravity of what is coming and am excited to see what will be shared in the near future.

  • NOTE: For historical accuracy, I need to point out that the context of this blog entry from 2014 was changed from a “WE” to an “I” perspective as the person who’d been collaborating with me from that time became hostile to the point I felt it best to remove references to him. At times, I’ll reference him as “Redacted.”

La Réalité Virtuelle – TimefireVR

Antonin Artaud in Virtual Reality

The term Virtual Reality may seem dated; after all, it has been over 70 years since poet and actor Antonin Artaud first penned the words “la réalité virtuelle.” Although his 1938 definition may seem far removed from what our technologically advanced world is about to deliver, his ideas were far from off base. He envisaged the theater as a place where the alchemical mythologies of man would become the “incandescent edge of the future.” Well, that is the verge of where we are today with a light-emitting headset called Oculus Rift that will allow us to peer not only into the future but across time and reality in ways the mass of humanity has yet to fathom.

The transmutation from lead into gold was for alchemists what the crafting of story and image into content is for our time. Storytelling is an ancient art, maybe 40,000 years old, as dated by cave paintings in Spain. Early plays and dances have existed for thousands of years, but it wasn’t until the 15th century that the narrative became a tradeable currency. The printing press brought with it the ability to distribute information to the masses. For the next 400 years, printing would dominate our communication channels until the arrival of the telegraph, motion picture film, and telephone. Together with television, the moving picture would show people across the globe an unknown world previously only written about in tomes or told to one another in an oral tradition.

It has taken us tens of thousands of years to reach the juncture where information is ubiquitous and drives nearly all human activity. For example, the global publishing industry is now worth $108 billion, effectively making it the 60th-largest economy if compared to countries. The global movie industry is puny in comparison at approximately $35 billion, while video games generate about $92 billion in revenue. All of these industries are being wrapped up by the new kid on the block: the Internet. That virtual department store/library/theater currently facilitates over $3 trillion of business – the Internet is the essential utility of our future.

What this all has to do with Virtual Reality is that we are at the point in time that marks the beginning of the future of humanity, just as art, printed language, and advanced communication did in their time. It is the convergence point where we enter hyperdrive. I make this prediction as though it were as easy as identifying a cave painting where the artist drew a horse, and we all know it's a horse.

All of humankind tells stories, we all have histories, we all celebrate our past, and most of us have dreams of the future. In VR, all creative and consumptive lines converge. We meld together and share the written word, the image, the game, the transaction – and we do it in an environment that speaks to and puts on display the dreams that live on the “edge of the future.”

We are all about to be thrust into new roles as architects of this future. This will be a place of alchemical experimentation where mythologies will come to life, not as two-hour celluloid epics, but in places where we dwell and create new myths. Except, we are neither intellectually prepared nor technologically advanced enough for what we must start preparing for – now.

While knowledge is everywhere and readily accessible, how many of us revel in the acquisition of the abstract and intricate? Most of those I see are more interested in the trivial and mass-produced banal culture as doled out by faceless corporations concerned with shareholder wealth and executive salaries than in the evolutionary intellectual vitality of their fellow people.

Our next point of embarkation must be on the vehicle of high-level brain exploration. The technology to show each other our dreams is soon upon us, though right now, it leaves much to the imagination as it can only deliver a fraction of the aesthetic fidelity we are fast approaching.

To return to my statement in the first paragraph regarding what we understand or fathom, Virtual Reality will be a magnifying glass, a kind of tunneling electron microscope that will peel back the layers of the onion to expose things for what they are. We have always been visual learners who quickly pick up on what the image holds. It is within VR that the image will become ever more intoxicating as technology advances to render greater beauty and detail out of the abstraction of pixels. Humans give order to chaos; we set letters in sequence to form words, we align and contrast colors to create art, we capture fleeting images of light in movies and then stand back in awe, sometimes crying at what we've created. Here at the edge of the future, we will continue our traditions of making sense of things, and while I am still uncertain as to what VR is ultimately going to look like, what I do know is that at the other end of its trajectory we will see a global society finally achieving its Magnum Opus: we are on the verge of discovering the elusive philosopher's stone.

The original image is available from Gallica Digital Library under the digital ID /ark:/12148/btv1b8539368j. This image is in the public domain because its copyright has expired.