META This! Series Ep.3 - David Talks Metaverse with Jason Fader, Also ChatGPT, NFTs, and the Game Industry’s Future.

Illustration by @max_gps

META This! Series Ep. 3

Podcast Transcript:

David:

Hello and welcome to the Nolan Heimann LOOK.legal podcast. This is the Meta This series, and today I have with me Jason Fader. Jason is an expert consultant on emerging technologies. He's a senior game developer with over 25 years' experience in the game industry. He is currently the CEO of Iocaine Studios and is working on an independent game project, and he is a personal friend of mine. I asked Jason to come on today to talk to us because I've had a number of conversations with him about emerging technologies and always value his insights. So Jason, welcome.

Jason Fader:

Hey, David. Happy to be here.

David:

Thank you for coming. So I wanted to start with the same basic question I'm asking everybody, which is, in your view, what is the Metaverse?

Jason Fader:

Ah, what is the Metaverse? The age-old question as of the '90s. I mean, I could define it like how Neal Stephenson had it in Snow Crash, but I prefer my own definition because I just like shorter definitions. So I would say the Metaverse is simply digital communities of content creators and content consumers engaging in activities. You can almost think of it like a theme park or more like a mini golf course that has a bunch of rides and arcades around it, and users are basically contributing and building these systems that other users can participate in. And when I say user, that can also mean businesses. Basically everything's fair game in a proper Metaverse where you can create the content, consume the content, and potentially profit from it.

David:

So how is that different from a website or a game or existing social media?

Jason Fader:

I'd say the main difference is accessibility and the spectrum of users that could be attracted to a metaverse. One of the earliest and easiest metaverses to talk about is Roblox. Not a whole lot of people saw Roblox as a metaverse early on, but it definitely is one, mainly because it is a digital community of content creators and content consumers and they're all on the platform of Roblox.

What Roblox does to empower users, though, is they have a lot of editing tools that don't require a whole lot of content creation knowledge. So very user-friendly tools, minimal barrier to actually incorporating all this content into these systems, and incentives for users to do so. They can profit from it, their creations are monetized, and everybody likes making things. It's always fun to be creative. And so they're tapping into a lot of that... your inner Lego.

David:

Yeah, Roblox in particular, you get a lot of kids on that platform, so you're engaging a young audience. So when I think about the Metaverse, you often think about things like Neal Stephenson's Snow Crash or Ready Player One, and those are 3D experiences. They're immersive experiences. Do you think of the Metaverse as needing to be 3D?

Jason Fader:

Not necessarily. As long as users are engaging, communicating, socializing, and enjoying content that was not necessarily made by one person, I think it qualifies as being a metaverse. It's the fluidity of content and the multitude of choices that define a metaverse in my opinion. And so you don't really have to have it in 3D. It doesn't even have to be in VR. Oftentimes VR users call this pancake mode where they're playing something on a monitor and not on their headset because a pancake is flat.

You can engage in these pancake experiences pretty easily. As a matter of fact, last weekend I was at a friend's birthday party in a metaverse. We were in a metaverse platform called Tower Unite, and it was I'd say about 10 of us. We all got online, talked in voice chat on Discord and hopped in this platform, played miniature golf together, did a bunch of other fun marble race games, chit chatted, and everybody was in their own custom avatars that other users had made.

And at one point, and this is a first-party product from the Tower Unite people, they had just incorporated almost like a Ghostbusters-type haunted house thing where it's a roller coaster, more like a slow roller coaster ride. You're in the ride with four friends, you all have these blasters, and you shoot at things in the haunted house, all of you racking up a score. It's like one of those fun experiences when you go to a theme park like Disneyland and get on one of those Toy Story rides where you're in a car and you have a gun and you can basically rack up points by shooting sideshow things.

And yeah, overall it was... It's not the same as going to a mini golf course in real life, but the experiences, the friendships, the conversations, I'd say it's about 80% close to that experience. And if you're an introvert and don't like going outside, it's perfect.

David:

Yeah. So talk to me about the actual user experience. When you're doing this, are you sitting at your desk maybe in your pajamas with your headset on, on the screen and actually live chatting with the people you're with?

Jason Fader:

Yeah. So I use a normal computer, could use a laptop. The only requirement is whatever you use has to have a microphone so you can communicate with people, although that's also optional. There's a lot of support for hearing impaired and speech impaired individuals and there's tools to allow users to communicate even if they can't talk or don't have a working microphone. They can use their keyboard and then a text bubble will pop up over their avatar so other users around them can see what they're saying.

And it's a lot of the accessibility features and quality of life features that really help bridge the gap when it comes to just a lot of these technologies. Yeah, ultimately it might be getting a little bit off topic, but I just wanted to say also about these experiences that you don't need much to get into it. You just basically need to know what you like and what sort of groups you want to engage and interact with. There's a lot of interest out there in a lot of communities, and if you find the right one, you could find a new friend group or new networking connections.

David:

Actually that's an important point because I think sometimes when people think of the Metaverse, they think of it as one unitary thing, but in fact, really when we're talking about the Metaverse, and you've already alluded to this, it's multiple different platforms and pieces of software, places you can go. Can you just talk a little bit about the different types of metaverses out there?

Jason Fader:

Sure. Some of these platforms may not necessarily call themselves a metaverse, but they pretty much are, and you find a lot of them mainly on VR platforms because that's where a lot of the interest is. So I would say the biggest VR metaverse platform out there is VR Chat. And then there's others that are also supported on things like the Meta Quest 2, like Rec Room, which is a more family-oriented version of a metaverse where it has a lot of activities you can engage in with groups. And then there are deeper metaverses that are, I would say, for power users and content creation enthusiasts.

One of the biggest, I'd say most feature-rich platforms that I've seen is one called NeosVR. It's done by a very small team and it basically empowers users to create content within their worlds using their in-world tools, which is something that I would say other metaverse communities like Roblox or even VR Chat still need to catch up on. Basically, when you go into a world, it would be nice if you could just start building while you are there, without having to download any external tools.

Going back to other metaverse platforms, there's quite a few more to list. Helios is another big one, and they're built on Unreal Engine, and so their visual fidelity is awesome. There's Chill Out VR, and then there's others that are starting to experiment with VR support, like Roblox is getting into it. But essentially, these are platforms, and you can call them metaverses, plural, but really the important thing is the communities that are within these platforms. And so you'll find a community that's in VR Chat, and they'll also have users that are in other platforms. But the thing that ties all these users together is their community or a club, depending on where they want to go.

So for example, last week at my friend's birthday party, we decided to go into Tower Unite, but sometimes we're in VR Chat, sometimes we're in NeosVR. It just depends on what everybody's in the mood for and what the community wants to get done at the time, if we're celebrating somebody's birthday or building something in the Metaverse.

Another example: about a year and a half ago, I participated in one of the maker competitions they hold in these metaverses. It was called the Metaverse Maker Competition, and it took place in NeosVR. I was with a team of 12 other users from all across the world, by the way. Since it's on the Internet, there are really no national barriers, just language barriers. But luckily a lot of folks speak English, which was great for me because I'm not very multilingual.

So we basically had a month to create something, whatever we wanted, as long as it fit certain categories. We decided to make a hotel in the Metaverse, something that would have a lot of good visual ambiance, nice relaxation spots, some sideshow games and an arcade area. We assembled all of this within a month, using assets that 3D modelers made and scripting that programmers on the team were able to do within the platform, submitted it to the competition, and made it to the finals.

It was an incredible experience, mainly because it was such a diverse group of people entering from all over the world, but all converging within this platform, engaging with each other. And at the end of the competition, you walk through and see everybody else's creations and how amazing all the stuff they built is. It's really an experience that I don't think I've felt in the real world at that scale, at least as a participant and also as a viewer.

And so I think there are just so many possibilities when it comes to the Metaverse as far as engagement with users, and I think we're just on the tip of the iceberg with it really.

David:

I want to shift gears a little bit. Over the last couple of years, NFTs have been hugely hyped. It seems to be cooling off now, but in the context of the Metaverse, do you think that NFTs have a purpose?

Jason Fader:

I would say they do. I think, like most new technologies, it wasn't immediately understood how best to utilize NFTs. And unfortunately, the early usage of NFTs was probably the most obvious one to a lot of people, which was to monetize them by selling Bored Ape pictures. But I don't think that's really the right way to utilize NFT technology.

When I first started getting into blockchain tech and researching this stuff, to me, how I felt NFTs should be used was more of the reverse approach to how we currently do it. Instead of a company selling NFTs to users, I believe users should be able to create their NFTs and exchange those with other users, and the platform that they exchange the NFTs on, those could maybe charge some microtransactions if they want.

But I think the real secret sauce to NFTs, and what the spirit of them truly is, comes down to proof of ownership. It's a way to track the chain of custody of an object or a thing, almost like a certificate of authenticity. So you know you're not getting a knockoff handbag. This handbag was made in the Metaverse by a user by the name of Armani. And so now you know it's actually proper and authentic.

And when it's stamped as an NFT, there's other things we can do in a game or a platform, where we can track who owned it over time. We can also track with the NFT what rules the creating user wants to apply. So for example, say I'm working in this cool metaverse Roblox sort of environment, hypothetically not Roblox itself but something Roblox-ish that supports NFTs, and I build this amazing house. I move all these little parts around and I have something called a house.

So I save this house out as a blueprint, or an NFT. And I specify in this blueprint that I want users to be able to build this, but if any user does use it, I want 10% of the resources they spend as a commission, so the creating user gets a kickback from it. And so then when people are browsing this online store of blueprints where they can build something, they see one they like, they go to build it, and since they're building something from an NFT and we know who owns the NFT, we can then credit the original creator of that thing and reward them for coming up with a cool design that other users really like.

I mean, the ecosystem I envision is basically one that empowers users to transact with users. I'm sure businesses could also create their own things that can engage with users on a transactional basis. But to me, really, I prefer more I guess the open source model of things and even crowdsourcing model of things where it's the users that are empowered to create the content, and the users can profit from the content. And if anybody else wants to profit, just make content.
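
The blueprint-and-commission scheme described above can be sketched in a few lines. Everything here is hypothetical, the class, the field names, and the token ID; no real platform exposes this API. It just illustrates the idea of a blueprint NFT that carries its creator's rules with it and credits that creator whenever someone builds from it.

```python
from dataclasses import dataclass

@dataclass
class BlueprintNFT:
    """Hypothetical blueprint NFT: records its creator and the commission they ask."""
    token_id: str
    creator: str
    commission_rate: float  # 0.10 means the creator takes 10% of resources spent

def build_from_blueprint(nft: BlueprintNFT, resources_spent: int) -> dict:
    """Split a build's resource cost between the builder and the blueprint's creator."""
    kickback = int(resources_spent * nft.commission_rate)
    return {"builder_pays": resources_spent, "creator_receives": kickback}

# A user builds Armani's house design, spending 500 resources; 50 go back to Armani.
house = BlueprintNFT(token_id="0xabc123", creator="Armani", commission_rate=0.10)
receipt = build_from_blueprint(house, resources_spent=500)
print(receipt)  # {'builder_pays': 500, 'creator_receives': 50}
```

Because the blueprint carries its creator and its rules with it, any storefront that honors the format could pay out the kickback automatically.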

David:

I had shared with you an article about an interview with I think it was Tim Sweeney from Epic, and he was talking about-

Jason Fader:

We call him The Tim.

David:

The Tim. He was talking about the future of this technology and how there may be cross-platform standards. It was exciting to me because it opens up, I think, a whole new world where the user content you're talking about or even corporate-created content could be brought from platform to platform with those standards. Can you explain how that would work?

Jason Fader:

Another dream with a lot of this blockchain tech is asset ubiquity. So let's say there's a cool Star Wars marketing tie-in where I can get an outfit from the Andor show, and I purchase this outfit in Roblox. It's a Disney NFT, and because it's an NFT and I purchased it, I own it. And if there's another platform I go into that also supports Disney NFTs, then I can use that outfit in there as well. It's basically a way of saying this user owns this asset regardless of what platform they're in, as long as the platform supports it.

Nothing's really doing that yet, but another one of the goals is to find a way for first parties to sell their licensed products as NFTs to users while ensuring that users can't pirate those things, and that the platforms they're in properly display those assets in a way that respects the original license holder.

I haven't seen a whole lot of things happening like that yet, but I think it's only a matter of time. Those standards are coming online, and I believe we're close to them. Tim Sweeney is a great visionary when it comes to a lot of this stuff because he's right: we do need clear standards to unify a lot of these systems. And whenever we have standards, the best place to start is open source standards. That way nothing's proprietary, and we have the content creators all adhering to an open source standard that's mutually agreed upon by the open source community.

David:

So I think there's an important implication there, and I want to make sure it's clear as I understand it. So right now, the way it would work is if you, for instance, wanted to go into Fortnite and have a Disney costume, you would log into Fortnite, you would go to the Fortnite store, you would buy the costume there, and you would wear it in Fortnite. When you leave Fortnite, that's it. If this were to work, it actually flips the model where instead you would go to Disney, you would buy the NFT of the outfit, and any metaverse you go into, if it's Roblox, if it's Fortnite, if it's Minecraft or anything else, if it's compatible with the Disney NFT, you would have that same outfit in each of those platforms. Is that right?

Jason Fader:

Mm-hmm. But there's an even smoother way to have that transaction, still allow the user to purchase it in Fortnite and just have a flag or an icon on it that says this is a universal asset or an NFT asset. And so if you'd like to know where else this is compatible, click on this link and this will take you to Disney's website that shows all supported platforms that support Disney NFTs. That way you keep the user within a single ecosystem and that reduces the friction to purchase.

And so it is dependent on other games or platforms supporting those NFTs, but if there's a standardized system for doing it, that makes it more possible. The only downside is that on each of those platforms, the asset, so if we're talking about an outfit from Star Wars, that outfit needs to be created specifically for that platform to match that platform's visual style. So there is a cost for the developers of those platforms to support all that, and one of the biggest sticking points is: why should we make assets for these things when we're not getting paid to make them yet? That's what currently needs to get figured out: how do you get people paid for their efforts in supporting something? And there's no easy answer to that yet.

I mean, there are possibilities where you can track which users have purchased the NFT, and from which store. So if you purchased it from the Fortnite store, then since Disney would be the license holder for that outfit, some of the money would go to Disney. And after that point, for any subsequent platforms where that outfit gets used, there's a way to track who's using it, so Disney can assign royalties or some profit share to the other platforms supporting Disney NFTs, and Disney can track where their NFTs are actually being used.

So it's possible. Technologically, it's doable. It's just complicated right now.

David:

It's the NFTs that make it possible, right? Because it's on the blockchain.

Jason Fader:

Absolutely. An NFT doesn't exist on any one company's server; it exists on the decentralized blockchain. And so once you have an NFT and it is assigned to you in your wallet, it's yours, or at least the proof of ownership is yours. You don't own the actual 3D asset, you just own the rights to use that asset. That's what that NFT says. Whereas if you're a content creator and you create something, like in the blueprint example, and you stamp that out as an NFT, then yes, you do own it, based on the rules of the platform you're in and how it allows NFT minting.

There's a lot of caveats to it, but these things are getting solved, and so it is getting a bit smoother as we go.

David:

There was a line I used early on when people were selling NFTs that it's a bit like owning a star. You have a nice certificate that says it's yours, but you can't do anything with it and everybody else can see it. I think when it's images, that makes sense, but it's not of much value. But when you talk about owning an asset or having rights to use an asset, really, in a platform, then that certificate becomes critical.

Jason Fader:

Mm-hmm. It's almost the equivalent of having a digital license that has extra data on it that allows you to verify the owner and also see who the previous owners were. So if you're not happy with your NFT Star Wars outfit, you could sell that to a friend or give it to a friend and now they have it, and the NFT doesn't care who owns it because you've essentially just passed a certificate of authenticity onto somebody else.
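
The certificate-of-authenticity idea, a record that remembers every owner and doesn't care who currently holds it, is easy to model. A real NFT keeps this history on a blockchain ledger; the plain list below stands in for the ledger, purely as a sketch, and the names are from the conversation.

```python
class OwnershipRecord:
    """Toy chain-of-custody record: a stand-in for an NFT's on-chain history."""

    def __init__(self, asset: str, first_owner: str):
        self.asset = asset
        self.owners = [first_owner]  # full provenance, oldest owner first

    @property
    def current_owner(self) -> str:
        return self.owners[-1]

    def transfer(self, new_owner: str) -> None:
        # Selling or gifting just appends to the history; nothing is erased.
        self.owners.append(new_owner)

outfit = OwnershipRecord("Star Wars outfit", first_owner="David")
outfit.transfer("Jason")
print(outfit.current_owner)  # Jason
print(outfit.owners)         # ['David', 'Jason'] -- previous owners stay visible
```

The record verifies the current owner while keeping every previous owner visible, which is exactly the "digital license with extra data" being described.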

David:

Right. So as much as the bloom is off NFTs, it is now on artificial intelligence, AI.

Jason Fader:

Oh yeah.

David:

And you and I have had exciting conversations about this, so I want to talk to you about what role AI plays in the Metaverse and how you see this new technology emerging.

Jason Fader:

So when we say AI, there's a lot of different things we could be talking about. I'll basically just cover the generative AI models and the transformer models. By the way, for those that don't know, the GPT in ChatGPT stands for generative pre-trained transformer, where the transformer is basically the large language model tech that helps make all this stuff possible. So in theory, you could say thanks to transformers, we have AI.

But I'm only going to talk about basically the chat AI models and the image generation ones, and to some extent the voice generation ones. So in the Metaverse right now, AI is not utilized too well, but it's starting. For example, recently... Roblox's editor is called Roblox Studio. They just integrated support for generative image creation using prompts. So if you're familiar with technology like Midjourney or Stable Diffusion, where you can type in a line of text and it outputs an image that matches your description, they now have that in Roblox Studio.

So a user could go into Roblox Studio, create a new map for Roblox, and if they drop a cube down, they can click on that cube, pull up an AI option, and start typing in a little text box, and this text box will give them textures. So they can say, basically, I want a wood texture that has moss growing on it. After they type that in, they'll see on one panel some options for what that texture can look like. They'll pick a texture they like, and then Roblox Studio will apply that texture to the object they have in the world. So users can use AI to create dynamic textures based on a prompt they're giving it.

That's an early phase of it. Where things are getting crazier now, though, is using the ChatGPT-style technology within these systems. So for example, one way that metaverses are currently benefiting from ChatGPT is through the integrations it can have with 3D modeling programs like Blender. Some demonstrations have shown that there's a plugin you can get for Blender where you type in the chat box what you want the model to look like, and ChatGPT, since it understands how Blender works through its plugin system, which I'll talk about in a second, interfaces with Blender and constructs a model to the best of its ability from your text prompt. Right now, users are demonstrating it with fairly simple examples like "create eight cubes spinning clockwise." But the secret sauce of this is the ChatGPT plugin system.

What plugins do is basically teach ChatGPT how to use other applications. And since Blender is open source, what the Blender community could do is essentially point ChatGPT at the Blender source code. ChatGPT can analyze the source code, and then from that and a brief description of what Blender does, like a paragraph description, ChatGPT essentially figures out how to use that tool, how to use that application. The same thing could theoretically be applied to Photoshop.

Since ChatGPT is a very, very good programmer, it knows how to understand code very quickly. And so once you point at the source code of something, it then understands it. And once it understands an application, it can then use it and sometimes it can use it better than a user can, but it still needs a bit more refinement.

So where all of this is heading is a seamless system where you're in a metaverse and all you have is a text box, or not even a text box, you talk to it and it can do speech to text. So you talk to an AI assistant in Fortnite, once Fortnite expands to more metaverse things. Let's just call it FortVerse. So you talk to an AI assistant in FortVerse and you basically say, I want a map that gives me a jet pack and a grappling hook that I can swing around and collect coins Mario style. All you tell it is that, and then within a few minutes, after it's thinking about it, it gives you a map in the FortVerse that does exactly what you wanted it to do.

Now since you weren't very specific in the prompt, it's going to have a lot of leeway as far as just what it can generate, but you can then refine it from there. You can continue talking to the AI assistant and say, give the jet pack two thrusters instead of one.

In some ways, you're essentially talking to the computer in Star Trek on a holodeck, describing what you want to see. The AI's job is to interpret what you describe and deliver those results, almost like a director talking to their subordinates and directing an art project or a movie or a game. To communicate with the AIs, you still need to know what you want. You need to have some style and direction in mind. You need to know roughly how things work. But once you know how to describe what you want, the AI can take it from there, and your job from that point is just to review its submissions, like you would if you were outsourcing something to somebody else, and provide feedback if you want it to iterate or keep working on it. So a good way to look at the AI assistant is as an outsourcing partner that you don't really have to pay too much.

David:

So just to drill down on that a little bit, I know one of the popular things going on in Roblox right now is users creating games that other players can play. For example, I actually had Justin Hawkins on the last episode and he was talking about how they created a Forever 21 Shop City experience in Roblox where people could create their own stores and sell products, and there was actually interaction with the real world. All of that takes programmers right now, who can spend the time creating the assets, writing the code. It's a laborious process that takes teams of people. How does AI change that?

Jason Fader:

Very easily at this point. Even now, you can use ChatGPT to help automate some of this stuff, because in order to get a store in Roblox, first you need the 3D assets. The traditional way, you'd contact a 3D modeler to make those assets. But if you're using ChatGPT... And this technology is evolving so rapidly that I imagine within a few weeks it'll be even more automated with 3D creation tools. But what you can probably do right now is, if you have a ChatGPT account, download Blender, download the Blender plugin for it, and get it to output 3D models.

Or probably an even easier thing users could do, and did even before there was AI, is use open source models, which have been a great resource for a lot of these Metaverse communities since before AI was able to create some of this content. But thanks to AI becoming more powerful, you could just type a prompt in Blender saying, give me a storefront for Forever 21 based on these pictures of Forever 21 stores. So you're giving it input, and now it understands what a Forever 21 store looks like, and the more pictures you give it, the more it learns.

Another cool thing about GPT-4, though this isn't available to everybody yet, is that it can understand images. So you can give GPT an image and then it knows what that image is. So you can feed this thing 10, 20, 30 pictures of the store from the inside, the outside. It has a good understanding of what it is. Tell it to make the model in Blender. It'll then create a rough model in Blender for you. Talk back and forth with it to refine it so that it looks good. Then once you have a model that you're happy with, you can then directly import that into Roblox because now you have the model. So you don't really need programming knowledge, but you do need to know how to move one asset from one platform to another, and that's something you can learn within an hour of time with a very, very beginner skill level.

Once you have that in there, then it just comes down to the scripting part within Roblox to enable that storefront to have the functionality that you want it to. And so ChatGPT, since it's been trained up to 2021 with its knowledge base, it already knows how to put things in Roblox if you ask it. So it can give you a step-by-step guide for what to do next and how to actually enable all of those things, and you just need to copy/paste the stuff it's giving you or just follow its instructions, and it has essentially replaced the work of probably a team of 10 or 20 people.

David:

So I want to use another example going back to when you were talking about Photoshop because I think that's easier for people who are a little older to understand. They probably don't know what Roblox is, but they probably do know what Photoshop is. I've learned a little bit how to use Photoshop. Many years ago I did a training program. It took me hours, and what I did was I took a picture with a lobster and I had to brush a second lobster in. It probably took me a couple of hours to learn all the tools and figure out how to do that. If you were to feed the Photoshop code into ChatGPT, how does that change that process?

Jason Fader:

So once something is enabled with a ChatGPT plugin, that empowers ChatGPT to utilize it as a tool. So if there were plugin support for ChatGPT in Photoshop, you'd need to enter your ChatGPT API key, which is how they identify that the requests are coming from your user account. That way they know who to bill, because ChatGPT is not free. You pay them per request, but it's fairly affordable.

So hypothetically, in this Photoshop plugin, what you could do is just type into a text box with an image already there. So let's use an example of, I want to remove my brother-in-law from a photo, because he divorced my sister and he's a jerk. So I put the photo in Photoshop, and in Photoshop, I might just do a little circle around where my ex-brother-in-law is and where I want him to not be, and then I type in the prompt: remove this person from the photo. And if they were in the middle of the shot, that would be weird. So I'd then add to the prompt: make it look like the remaining group of people in the photo are close together, and that this person never existed and never owed my sister money. So after you press enter and do all that stuff, the person is gone, and you have a photo and hopefully better memories moving forward.

David:

It's an amazing change in game development. It could really revolutionize the way people make games and the way users generate content.

Jason Fader:

It's already starting. There's a popular game engine called Unity. Right now, Unity and Unreal Engine are the two most popular engines. Unity recently announced Unity AI. They didn't reveal a whole lot of information about it, but if it's anything like the early ChatGPT plugins for Unity, it's going to basically empower game developers to do a lot more with less. So in that text prompt example, if you have a text prompt in Unity that can plug into ChatGPT or Unity's AI, you can type into it, like I was saying with the Roblox example, give me a game that has a jet pack and a grappling hook for swinging from blocks and collecting coins. It can take that prompt and give you a game that does that. It won't be fun yet, but it gives you a starting point. It gives you something to modify. It gives you something to learn from, because the code it generates is going to follow best practices from Unity.

So this will also enable junior programmers to become better, because they'll see the right ways of doing things, and because every engine is complicated, it allows complicated things to be simplified. Since the AI understands how to use the engine better than you do, you can just ask it questions.

For example, a common thing most games need to solve is how characters move from one point to another without colliding with one another. You can just have ChatGPT or the AI solve that problem for you by adding collision and pathfinding to the characters themselves and giving you the boilerplate code to do so.
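
The pathfinding boilerplate mentioned here usually boils down to a graph search. Below is a minimal breadth-first search on a grid, the kind of starting-point code an AI assistant might hand back; the grid format and function name are illustrative assumptions, not any engine's actual API.

```python
from collections import deque

def find_path(grid, start, goal):
    """Shortest path on a grid via breadth-first search.
    grid[y][x] == 1 marks an obstacle; returns a list of (x, y) steps, or None."""
    height, width = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        current = frontier.popleft()
        if current == goal:
            path = []
            while current is not None:  # walk the parent links back to start
                path.append(current)
                current = came_from[current]
            return path[::-1]
        x, y = current
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height \
                    and grid[ny][nx] == 0 and (nx, ny) not in came_from:
                came_from[(nx, ny)] = current
                frontier.append((nx, ny))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(find_path(grid, (0, 0), (0, 2)))  # routes around the wall of 1s
```

Production engines layer steering and collision avoidance on top, but this is the shape of the boilerplate an assistant can generate and explain.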

The big thing about all of these AI models is that they're very, very good at programming. And so we're quickly getting to another age of programming. Let's say in the '70s and '80s it was very functional, low-level programming, and then scripting languages became popular from the '90s up to today. Now we're reaching the next abstraction layer above those programming systems, which is AI prompting: the ability to communicate effectively with an AI service to get the results you want in as few words as possible.

It's a loose skillset, but it's still a very, very new skillset, because talking to an AI is supposed to use natural language, like talking to a person. So really, it's just leveraging soft skills, just knowing how to articulate what you want to another human, and ChatGPT then knows how to interpret it because it's been trained to interpret things like a human would. The barrier to entry of knowing how to convey what you want is being significantly reduced, and we're basically heading toward that future where you just pick up your mouse, talk into your mouse and say, hello, computer, give me this. And it'll give you that.

David:

It's like the real world is catching up to Westworld a lot faster than expected.

Jason Fader:

Yeah, dangerously fast.

David:

Yes. We're out of time, but thank you, Jason. This was fascinating. I'd love to talk to you again about this sometime, but thank you and I really appreciate your insights.

Jason Fader:

Yeah. This was fun. Thanks for having me.

