Evryway are available for consulting and development on your AR and VR projects. Talk to us at [email protected] to see how we can help you.
Good lord, it’s October. When did that happen? I’ve been doing some contract work since April, and that’s now finished. I’d love to talk about it, and I might do that in the near future - bits of the experience were great, but the ending wasn’t quite what I expected. As with the best stories in life, when doors shut, other doors open - it’s going to be an interesting few months. While I’ve been cranking away on other people’s stuff, many cool things have been happening. For example, Oculus just made a lot of noise about their new product, the Oculus Go, which is basically an all-in-one Gear VR/Daydream type of thing.
A tale of Tango, ARCore and ARKit
I’ve been promising an article that’s actually useful for a while, and something that’s been really useful for me over the last few years is the Unity Cache Server. I’ve recently set it up to work with multiple versions of Unity, with a separate cache for each version. Care about why and how? Then read on!
Quiet in here, innit?
Presenting content in VR
My zeroth year comes to an end.
The dev blog is a little late this week, as Februaryitis kicked in. It’s far too cold to be sitting in my garage on a weekend typing up a blog post, and it’s been a busy one.
Evryway Scanner is now in open beta. Hurrah!
While the world marches on towards syncopation, I’ve been heads down, preparing Scanner for open beta.
This week I have been mostly replaying and occluding things.
This week, I’ve been looking at fixing some of the more obvious flaws with my texturing process, including through projection.
My first week “back at work” has been filled with consuming excess Christmas pastries and improving Evryway Scanner.
This week - holidays! Happy 2017 to you!
This week, I have been mostly doing holiday. Merry Christmas!
This week, I’ve been shipping. 4 downloads and counting! Oh, and scanning things.
42 - the answer to life, the universe, and everything. It also comes after 41.
This week I’ve cleaned up the texture atlasing a little, and spent some time getting Scanner ready to show to a slightly wider audience.
Normally the big updates are all progress talk. This week there’s some of that, and some other things too!
Texturing a model from two different camera views is this week’s challenge. And it’s almost working.
This week, I’ve been texturing up the mesh from the Kinect texture feed. I’ve also resolved a load of issues with SharpChisel’s marching cubes.
I’ve made good progress on the networking this week - I’ve got four devices all feeding point clouds into the same data set (two Kinect cameras, two Tango tablets). They are not aligned yet - that’s one of the three remaining big chunks of work before this is a product.
Another holiday week (love spending time with my family!) - so lots of digging on the beach, but no implementation progress. I popped in to see the Boss Alien crew and caught the afternoon of Tedx Brighton (thanks for the early birthday present, Nic). Next week, back to it!
This week I’ve got the networking issues mostly resolved, and point clouds coming over the network from the Tango device to the PC.
This week, I’ve done very little. Flatbuffers serialisation is working, chucking test data over the network using Flatbuffers is working, and that’s as far as I got.
This week I’ve been figuring out how to send large chunks of stuff over the network in a maintainable fashion, which has boiled down to fighting with serialization and structure/schema definitions. Not a fun week, all told.
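The post above is about getting schema-defined messages across a socket in a maintainable way. As a minimal sketch of the general pattern (not the actual Evryway protocol - the names and framing here are illustrative assumptions), each serialized chunk can be length-prefixed so the receiver knows where one message ends and the next begins:

```python
import struct

def frame(payload: bytes) -> bytes:
    # Prefix each message with a 4-byte little-endian length so the
    # receiver can split the byte stream back into serialized chunks.
    return struct.pack("<I", len(payload)) + payload

def unframe(stream: bytes):
    # Walk the byte stream, yielding one payload per length-prefixed frame.
    offset = 0
    while offset < len(stream):
        (length,) = struct.unpack_from("<I", stream, offset)
        offset += 4
        yield stream[offset:offset + length]
        offset += length

# Two serialized "point cloud" chunks sent back-to-back on the wire.
wire = frame(b"chunk-one") + frame(b"chunk-two")
print(list(unframe(wire)))  # [b'chunk-one', b'chunk-two']
```

A schema library such as FlatBuffers handles the payload layout itself; framing like this (or the library's own size-prefixed variant) handles the stream boundaries.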
I’ve been somewhat distracted this week, but I managed to spend a couple of days getting a point cloud feed working over the network. The point cloud feed is very heavily throttled (something like 1K points a second currently) which means the results take a long time to look decent - but it’s working. Apologies for the “lots of words” update, I’ll have some cool videos next week.
This week - Pointclouds and preparation for networking. I’ve also been building a new PC.
Meshing speedups and Tango upgrades this week. Even some video!
This week, I’ve finally written my own meshing solution. Read on for details!
My implementation of Chisel is coming along nicely. Most of the re-write work is now done - this week I’m hoping to spend a bit of time taking depth frames and point clouds from Tango and Kinect and start testing it out properly.
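Chisel-style reconstruction fuses depth frames into a truncated signed distance field (TSDF). As a hypothetical minimal sketch of that core voxel update (names and constants are illustrative, not taken from SharpChisel), each depth sample nudges a voxel's stored distance towards the measured surface with a running weighted average:

```python
TRUNCATION = 0.1  # metres; clamp distances near the surface

def update_voxel(tsdf, weight, voxel_depth, measured_depth, max_weight=100.0):
    """Fuse one depth measurement into a voxel's (tsdf, weight) pair."""
    # Signed distance from the voxel to the measured surface along the ray.
    sdf = measured_depth - voxel_depth
    if sdf < -TRUNCATION:
        return tsdf, weight  # voxel is well behind the surface: no update
    d = min(sdf, TRUNCATION) / TRUNCATION  # truncate, normalise to [-1, 1]
    new_weight = min(weight + 1.0, max_weight)
    new_tsdf = (tsdf * weight + d) / (weight + 1.0)
    return new_tsdf, new_weight

# A voxel 5 cm in front of the measured surface settles at tsdf = 0.5.
t, w = 0.0, 0.0
for _ in range(10):
    t, w = update_voxel(t, w, voxel_depth=0.95, measured_depth=1.0)
print(round(t, 3), w)  # 0.5 10.0
```

Marching cubes then extracts the mesh at the tsdf = 0 crossing of this field.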
Moar holiday! And a start on implementing a C# version of Chisel.
Half a year down, and where are we?
Another week spent enjoying the summer holiday with my family. It’s been awesome.
More holiday! I’ve been enjoying the sun and time with the kids.
This week - holiday!
This week I’ve spent a tiny amount of time looking at blending between panoramas, and a big chunk of time trying to get toolchains working.
A fun week this week! Not much implementation, lots of talking. I caught up with some friends down at Develop in Brighton on Thursday, including a quick visit to the Boss Alien Mothership. Lots of great things happening there, and lots of interesting conversations with folks about what I’ve been up to and where they think it could go.
Not much to report this week. I’ve been cleaning up the codebase in preparation for a couple of new tools and a new approach for projecting entities in the world.
This week has been a bust, from the point of view of progress on the holodeck. No pictures or videos until next week - sorry!
This week, I’ve been putting my R200 scanner to use capturing human body poses, and attempting to get them to animate. Videos and stuff below, read on.
This week I’ve continued exploring Avatars, getting a rigged / skinned mesh up and running in Unity, driven by the Kinect2. I’ve also been looking further into RGB-D sensors and scanning software for facial and body capture.
How does one represent oneself to the world? A tricky question, both physically and metaphorically. The apparel oft proclaims the man, according to the Bard. This week, I’ve started to explore Avatars.
A holodeck ain’t much fun if it’s just you. Or rather, it’s loads of fun - but it’s more fun with friends! This week I’ve been looking at how to chuck stuff around the network, starting with a simple painting prototype.
Very quick one this week, as I’ve got very little work done!
I’ve always wanted to play God, and I’m a huge fan of god games. One prototype I’ve had on my list to make for a while is some form of solar system formation simulator. I’ve taken a very early stab at it this week, with some lessons learned.
I’ve spent a big chunk of this week looking at 360 video, plus I got Photon networking dropped into the Portals prototype. Videos below!
This week I’ve been enjoying the sun and the family time. I’ve also spent a little time working on a game prototype on the Vive and finished up my codebase refactor so I can spin up new prototypes quickly. Details below.
This week has been a slow one. Apologies for the late post!
Week 9 - two months in (and a few days!). It’s as good a time as any to look at goals (short and long term) and figure out what to do next.
Portals, Vive, multiplatform - what an exciting week!
I’ve now added portals to the holodeck. You can see through them, and walk through them. Video below.
This week, I have been improving mesh capture and playing with my new Oculus Rift. Videos below!
This week, I’ve been taking a well-earned rest and spending time with the wife and kids. I’ve made good progress with the atlasing, but the on-device performance of textured capture is pretty slow with my current process, so some work needs to be done to split work units across frames, that sort of thing.
Constructing realtime texture atlases is trickier than I expected. No videos this week, mostly because there’s nothing terribly cool to show.
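For a flavour of why atlasing gets tricky, here's a deliberately simple shelf-packing sketch - an assumed baseline approach, not the actual Scanner implementation. It sorts rectangles tallest-first and fills horizontal shelves; a realtime system would add padding, eviction, and incremental repacking on top:

```python
def shelf_pack(rects, atlas_width):
    """Pack (w, h) rectangles into shelves; return index -> (x, y) placements.

    Sort tallest-first, fill each shelf left to right, and open a new
    shelf when a rectangle no longer fits on the current one.
    """
    placements = {}
    x = 0
    shelf_y = 0
    shelf_h = 0
    for i, (w, h) in sorted(enumerate(rects), key=lambda r: -r[1][1]):
        if x + w > atlas_width:  # shelf full: start a new one below
            shelf_y += shelf_h
            x, shelf_h = 0, 0
        placements[i] = (x, shelf_y)
        x += w
        shelf_h = max(shelf_h, h)
    return placements

rects = [(64, 64), (128, 32), (32, 32), (64, 128)]
print(shelf_pack(rects, 256))
# {3: (0, 0), 0: (64, 0), 1: (128, 0), 2: (0, 128)}
```

Shelf packing wastes space above short rectangles on a tall shelf, which is one reason realtime atlases of irregularly sized capture fragments are harder than they look.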
Today, I’m intending to get the textured mesh up to a much higher quality. There are lots of things to fix (LOTS of things). One of the key parts of this is being able to examine the mesh somewhere other than on device, so I’ve written an exporter.
Capturing the world in realtime into VR? done that. How about with texture mapping? Videos below!
Realtime textured mesh into VR - It’s working. Like all things @Evryway so far, it looks pretty rubbish and there’s a huge number of issues and potential improvements, but the first steps are always the most exciting. Video below!
This week has been an exploration of walking around house-scale VR using Project Tango. It’s raised more questions than answers, and this is the point where the real hard work begins. I’ll briefly cover where I’m at with Cameras, UI, and the realtime environment capture work I’ve done so far.
Today, I’ve got realtime meshing working in the Holodeck. Video below, technical details below that!
Here in my garage, I have the world. No Lamborghinis (yet). Not even virtual ones.
I’m wearing a VR headset. I can see stuff in the world. Where am I?
This week has seen the pending release of probably the two extremes of VR hardware. HTC’s Vive is now officially available for pre-order - and McDonald’s is creating a Cardboard-a-like device using your chip holder (or fries holder, for my US visitors).
Welcome to my “How to build a holodeck - follow along at home with sticky tape and scissors” dev blog. This is week one, which means we’ll be figuring out how to do this (the process) as well as how to do this (the content) as we go. I’m hoping to make this a regular thing - and that really depends on whether there are enough readers to make it worthwhile. If you like this, share it and tell me!
Evryway is experimenting with virtual and augmented reality with one primary goal - to bring you and the world closer together.