Brief News

Otoy’s Jules Urbach believes the blockchain can produce a crowdsourced rendering engine


Once upon a time, you had to submit a program to the computer department. They would run it, and send the results back to you in the form of a printout. That worked well in the days when there was one big computing resource that everybody had to share.

In a way, Jules Urbach, CEO of Otoy (maker of the Octane graphics renderer), believes those days could come back, except the computer we are sharing is all of the computers on the Internet. He applies that computer history to the idea of cloud rendering. Rendering images or games is what happens when you submit your graphics or animation project, and the big computer, in this case a data center, goes to work on it. If you’re like Pixar, you might wait a whole day for your next animated movie scene to come back as a full rendering. Pixar can afford to pay for that, but smaller game makers and animators can’t.

That’s where blockchain comes in, and it’s why Urbach is starting an Initial Coin Offering (ICO) for what he calls a Render Token. It’s a blockchain-based currency that people can invest in, and it represents a distributed graphics processing unit (GPU) rendering network. Everyone with a computer can contribute spare cycles to a collective rendering machine when their machines aren’t otherwise in use. Artists submit their work to the network and get their rendering jobs done.
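The idea can be sketched as a simple work queue: a render job is split into independently renderable chunks, idle contributor GPUs claim chunks, and whoever returns finished work earns credit. The following is a toy illustration only; all names are hypothetical and this is not Otoy’s actual Render Token protocol.

```python
from collections import deque

# Toy model of a crowdsourced render network (hypothetical, for
# illustration only -- not Otoy's actual Render Token protocol).

class RenderNetwork:
    def __init__(self):
        self.queue = deque()   # frame chunks waiting to be rendered
        self.credits = {}      # credits earned per contributing node

    def submit_job(self, job_id, num_frames):
        # An artist splits a job into independently renderable frames.
        for frame in range(num_frames):
            self.queue.append((job_id, frame))

    def contribute(self, node_id):
        # An idle GPU claims one chunk, "renders" it, and earns a credit.
        if not self.queue:
            return None
        job_id, frame = self.queue.popleft()
        self.credits[node_id] = self.credits.get(node_id, 0) + 1
        return (job_id, frame)

net = RenderNetwork()
net.submit_job("lightfield_scene", 4)
for node in ["gpu_a", "gpu_b", "gpu_a", "gpu_c"]:
    net.contribute(node)
print(net.credits)  # gpu_a rendered two chunks; gpu_b and gpu_c one each
```

The real system would add verification of returned frames and on-chain settlement, but the core economics are this simple: work in, credit out.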

The Render Token ICO kicks off on October 5. If it works, it will usher in a brave new world where people contribute their spare compute cycles to the cause and earn something for it. I talked to Urbach about this big idea.

Here’s an edited transcript of our interview.

VentureBeat: Do you want to talk about why you’re going the ICO route, and explain what Render is as part of that?

Jules Urbach: Render is a way of solving a problem that I foresaw years ago. There’s a patent from 2009 that reflects my view. The idea behind the token sale is something I was thinking about well before I knew anything about the state of cryptocurrency and Bitcoin and Ethereum and all these ideas that have come out of the blockchain. I had always imagined that, five or 10 years down the line, things would be built through Octane for rendering. The rendering power for that needed to be more than one person, or even a few people, could provide. And so I came up with a few ideas and patents around getting this to run on the millions of graphics cards that were out there.

Even some of our earliest work for Autodesk was built around peer-to-peer streaming and sharing, not just the cloud-based stuff. It was through our work with Autodesk that I got a really big break. Together with Autodesk, in 2013, we launched on the G2s in the cloud. It was right before that that I went on stage with Jen-Hsun Huang at the Nvidia conference and showed Octane running on hundreds of graphics cards, knowing that Amazon would have that available within months. We launched the cloud service with that in mind. Amazon was part of making cloud rendering as we have it now in Octane, leveraging that rendering to make crazier things like light fields possible. If you look at our current partnerships with Facebook, what we’re doing with video is still basically all Octane-rendered. With Unity we’re integrated primarily to drive our AR, light field, and volumetric rendering business.

All those things are pushing what we need Amazon to deliver through the roof. We have light-field-based render jobs that cost $150,000. They could be cheaper, and they could be done on the public cloud, but over the last four years we’ve seen that price not really change much. Our user base is in the millions now with Unity. So I went back to the idea: we’re going to keep all the GPUs we already have in our service, but we need a lot more, and it needs to be cheaper. The best way to do that goes back to my plan to have millions of graphics cards contributing.

Above: Jules Urbach, CEO of Otoy.

Image Credit: Otoy

VB: When I talked to [Epic Games CEO] Tim Sweeney, he was talking about blockchain as well. He mentioned that, eventually, it’s something we want to run at 60 frames per second, while right now the blockchain updates about once every 10 minutes. That seems like an awfully big distance to go. How far away from that goal are we, technically? Is there something that helps close that gap to the point where this kind of rendering is useful?

Urbach: Tim is right about the 60 frames per second. Ultimately you want to be able to deliver those actions instantaneously. There’s work being done on Ethereum that is designed to create shards: shifting Ethereum to multiple side chains that can validate work much faster. That’s an important consideration for the future.

But what’s interesting for us is that I don’t need any of that for the stuff we’re doing. The first reason is that if you look at this work as an extension — there’s a lot of work that Amazon’s and Google’s computers and GPUs are doing, not just for rendering, but for other things. If you can come up with a way to deliver more and cheaper compute power for those services — the most expensive things we do at Amazon are not real-time streams. They can be, but that’s a small percentage of our business. You remember what we demonstrated in 2013, and we use it for Oculus and Facebook and others, but the majority of the money and the majority of our capacity issues are all about high-latency jobs.

In other words, people want a light field render back within a few days; it would take them months to render it locally, if it’s even possible. You just need the work to get done. Baking lightmaps, which is part of making games with Unity and Unreal, is also offline work. Typically a lightmap in a game takes about a day to generate before you publish. Moving that to the cloud, again — nobody needs it in an instant. The way we built this system is to replace the one-day turnaround we have with something faster. If you want next-day service and you’re doing 6K rendering work, that’s where we need to talk further. That’s the big issue I’m trying to unblock here.

Having transactions where we unlock 60 hours of rendering from one transaction, that’s fine. That’s a great way of leveraging the blockchain, and it’s exactly what we need. What’s also interesting is that a lot of blockchain issues, like transactions happening at 60 cycles per second, aren’t what we’re trying to solve. I’m looking at this as building blocks of reality: we’re literally rendering a simulation of light. We’re at the point where we can measure the benchmark for that very well. Today, render farms measure their efficiency and the value of what they’re delivering in OctaneBench per dollar.

This token is an even finer-grained measurement of work, OctaneBench work. What we deliver right now on the cloud service comes in $5 increments. The cost is designed to go down; today it’s driven by the fact that demand for rendering is so high that we have to fulfill a lot of the work through Amazon at the prices Amazon charges.
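To see how metering work in benchmark units rather than wall-clock time plays out, here is a back-of-the-envelope sketch. Every number in it is assumed for illustration; none are Otoy’s actual rates.

```python
# Hypothetical back-of-the-envelope pricing in OctaneBench (OB) units.
# None of these numbers are Otoy's actual rates; they only illustrate how
# render work could be metered in benchmark units instead of clock time.

def job_cost(ob_hours, price_per_ob_hour):
    """Cost of a render job measured in OctaneBench-hours."""
    return ob_hours * price_per_ob_hour

farm_ob_score = 2000   # combined OctaneBench score of a small render farm
render_hours = 24      # one-day turnaround for the job
price = 0.10           # assumed dollars per OctaneBench-hour

ob_hours = farm_ob_score * render_hours   # 48,000 OB-hours of work
print(f"${job_cost(ob_hours, price):,.2f}")  # prints $4,800.00
```

The point of the unit is that the same 48,000 OB-hour job costs the same whether one fast card grinds for a month or a thousand contributed cards finish it overnight.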

Above: Color Noise by Linus Zoll

Image Credit: Linus Zoll

VB: To summarize a bit, then, the cost for cloud rendering has dramatically improved over the years, but it’s hit a wall of some kind? The cost hasn’t come down as fast as you would like more recently? Is this the next step to shake that loose again?

Urbach: Right. It’s on two fronts. I was part of the launch of Amazon’s GPU services in 2013, the ones we use today to do rendering. The price they launched at in 2013 is basically identical to the price today, even though they’ve upped the GPUs to the G3. The GPU work is priced no differently than it was in 2013. That’s not great. We haven’t seen any serious competition to that pricing from Google or others. That might change; it’s expected to. But that’s only one dimension of the problem.

The other one is that even with Amazon, there are tens of thousands of these computers available and it’s still not enough. We could leverage thousands of GPUs per job — a lot of our rendering jobs are art assets, and a lot of them could be knocked out by a thousand GPUs. The problem is, as we grow our business and get even 10 customers per day who want to render these kinds of jobs (and that will happen with AR, as volumetric rendering becomes so important, and in games), we don’t have those GPUs at Amazon. So pricing and capacity are the two areas that need to get solved.

What’s fascinating is that there’s another dimension of real-world problems here. So much GPU power is being used today to mine cryptocurrency. For the first time this year, we’ve found that our customers who want to run machines locally to render with Octane faster can’t get the actual hardware, because miners have bought up all the cards and there aren’t any left. Look at how much power it takes to run Bitcoin — it consumes more energy than most countries. It’s sucking away energy and GPUs, and mining doesn’t produce anything valuable beyond the token itself. And it gets more and more expensive every day as the blockchain grows.

One of the goals I had with this system is that the timing is just about right: you can make more money consistently from pulling down a render job at that cost than you would from, essentially, pulling the lottery lever and seeing if you get an Ethereum coin. Right now, mining for Bitcoin, you’re not going to get anywhere.


VentureBeat
