Privacy by Design w/ Chris Foster

We talk about end-to-end encryption with our first guest. Chris Foster, CTO of Two Story Robot, takes us on a deep dive into cryptography.
Show Notes
We've deliberately chosen to design privacy into Clinnect. This means using cryptography to ensure that only the intended recipient is able to view patient data. In fact, as builders of the software, we can't even see the patient data.
For the curious, Chris suggests these articles to better understand cryptography:
We highly recommend using a password manager to keep yourself safer on the internet. Many, including LastPass, are free.

Fact Check

The LifeLabs hack was one of the largest data breaches in Canadian history. An estimated 15 million Canadians were affected.

Find Us Online

Follow us on Twitter @FixingFaxes.

Credits

Produced by Jonathan Bowers and Angela Hapke
Music by Andrew Codeman (CC BY 3.0)

Transcript

Jonathan:  Check this out, Chris. So we've got these new pop filters. This is it. Without the pop filter, Peter Piper picked a peck of pickled peppers.
[00:00:09] And with the pop filter, Peter Piper picked a Peck of pickled peppers
[00:00:15] Chris: So much better.
[00:00:16] Angela: Isn't it?
[00:00:17] Jonathan: Much better.
[00:00:17] Chris: I feel a little bit like the black sheep, because I'm that person who joins the podcast and does not have a high quality mic. And I know as a listener, whenever I hear that, I'm like, Ugggg!
[00:00:30] Angela: Do you? Because I'm more like, Oh, thank God. Not everybody has everything in their house.
[00:00:30] Chris: I usually just skip podcasts that have guests like me.

[00:00:45]Introduction

[00:00:45] Jonathan:  Hi, I'm Jonathan Bowers
[00:00:49]Angela: and I'm Angela Hapke. And I went camping for the first time with my family last weekend. We bought a
[00:00:57] new tent trailer.
[00:00:58] Jonathan: The first time ever?
[00:01:00] Angela: With all four of us. Yup.
[00:01:02] Jonathan: Oh, wow.
[00:01:03] Angela: Yeah.
[00:01:04] Jonathan: Anyone get any sleep?
[00:01:05]Angela: So we bought a popup trailer, and Brad and Alex were on one side and Nora and I were on the other side. One half of the trailer got sleep. It was not my side.
[00:01:19] When I got home, I promptly ordered memory foam, like two-inch memory foam toppers, for the mattresses, because both Nora and I were like, oh, heck no, we're not doing that.
[00:01:34] We joke that our children are like drunk octopuses trying to search for their keys when they're sleeping at night. Like, that's a bit how Nora is. So yeah, it was a lot of toe kicks to the kidneys and moving around, and yeah, it was tough.
[00:01:51] Today we have a guest, uh, the chief technology officer at Two Story Robot. Can you introduce yourself?
[00:01:58]Chris: Hi, my name is Chris Foster. I'm, like you said, the chief technology officer at Two Story Robot. I have been building web applications for about a decade now. Um, and before that I was into computer security pretty heavily. I have a degree in computer science with a specialization in software engineering, as well as a graduate degree in computational neuroscience and artificial intelligence.
[00:02:29] Angela: Oh my goodness. A lot of those words didn't make sense to me, but that's okay.
[00:02:36] Jonathan: You said computational neuroscience. That's an obscure term. So what does that mean?
[00:02:41] Chris: Yeah. We used machine learning models to better understand how language is processed in the human brain.
[00:02:47]Jonathan: How did you do that?
[00:02:48]Chris: We put some people in a very uncomfortable machine. It's called an EEG machine. So they put a whole bunch of goop in your hair, and sensors, and then we make you sit in a dark room for what feels like a very long time, staring at symbols on a screen as you learn to map those to English words. Uh, we tried to sort of replicate an experiment that was done with a much, much more expensive machine.
[00:03:12] And then we showed that you don't necessarily need the $1.5 million machine; instead, you can do it with something that's more in the range of $60,000. We did it while trying to learn kind of a language that we made up, which was something that was new too.
[00:03:25]Jonathan: That's cool.
[00:03:25] Angela: That is cool.
[00:03:27] Chris: It was a fun project, but yeah, there's definitely nothing like graduate studies to make you feel like you have no idea about computational neuroscience. More questions than answers at the end of it, it often feels like.
[00:03:39]Jonathan: You've expanded your knowledge a bit, but you've also expanded that surface area of things you know that you have no idea about. Um, which I like. I like that feeling. I like knowing that there's all this world of things that I don't know. It feels like a better place than not knowing that that stuff exists.
[00:03:55] Um, it's something that I talk about with the team every now and again. My goal for our team is not to expand the circle of knowledge of things they know. It's to expand the circle of things they know they don't know, because that stuff you can go and learn.
[00:04:18] You don't need to know all that stuff. You need to know that it exists and that you can go and find it.
[00:04:23] Angela: I think you're right. And I think that's probably a good segue into what we're talking about today. Ah,
[00:04:29] Chris: It is because computer science follows a very similar learning curve. I think.
[00:04:33]Angela: As the CEO of a digital health company, I think we're about to find out how much I know about the topic of encryption, and how it is more about knowing what you don't know and finding the right people to do it.

[00:04:58]What is Encryption at a High Level?

[00:04:58] Jonathan: Yeah. And so, yeah, that's the topic of today. We wanted to talk about encryption, um, because Clinnect is, um, what's called end-to-end encrypted, which practically means that only the person who sent a referral and the person who receives a referral can read or see any of that data.
[00:05:20] No one else can see that, including us as the builders of this software. Chris, how would you characterize how encryption is discussed in terms of products and things that exist now?
[00:05:30]Chris: Encryption comes up all the time. And maybe from a layman's perspective, it can often seem like encryption is encryption, which I guess is technically true, but how you're using that encryption really matters for how private your data is. Um, and it kind of fits into three broad categories.
[00:05:52] The first category is the most popular type of encryption. Whether you know it or not, you use this all the time in your day-to-day life: communication encryption. This is encrypting data between two endpoints that are talking to each other. A good example of this is when you open up the Facebook application and Facebook goes and fetches your profile data, or your timeline data, from facebook.com.
[00:06:17] It's doing that in an encrypted way, so your internet service provider, for example, can't read that data, but Facebook can. So although there's encryption in place there, it's not the same as other types of encryption that might protect your data from everyone, even including Facebook.
[00:06:32] The second type is encryption at rest. So this is maybe if you have a file on your computer and you've decided to encrypt that file, and you've used a password to do that. Or if you're using something like the Mac operating system's encryption feature: no one can actually open up your Mac and read all the data on it without your password. So if you're using that feature, then that's encryption at rest. While your Mac is actually unlocked, someone could certainly come over to your computer and access all the data.
[00:07:00] But if you had your computer turned off and someone stole it and ran away, they wouldn't be able to read any of the data off the hard drive. So that's another way that encryption is often used. And then the third way, which is probably the most privacy-preserving but less common, is end-to-end encryption.
[00:07:19] End-to-end encryption is similar to when you're using a tool like Facebook, except that even Facebook, as the party passing the data around, can't read it. For example, when I use Facebook Messenger and I send a message to someone else on Facebook Messenger, that person receives it.
[00:07:38] And both of us are encrypted when we talk to Facebook, but Facebook in theory could read those messages. Um, Facebook does actually have an end-to-end encryption mode, and if you were to turn that on, the encryption is then directly between me and whoever I'm messaging. So if I'm messaging Jonathan, that would mean that even Facebook can't read those messages, because the encryption is directly between us. It's a little bit harder to set up, certainly more complicated, and it makes building an application
[00:08:05] have a lot of interesting limitations and technical challenges, and all sorts of feature problems that can come up when you, as the company, can't read the data. But that's what we've tried to do with Clinnect to protect patient privacy, um, just because it's so important, right? So that's what we've done here: when someone sends a referral to someone else, us at Two Story Robot or Clinnect, we can't actually see that data. It's directly encrypted between the members of the sending medical practice and the members of the receiving medical practice.
[00:08:35] To be secure, even the baseline requirement is that internet service providers should not be able to read your data. That is the bare minimum for building a web application today.
[00:08:45] But being end-to-end encrypted is definitely a lot more forward-thinking.
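[Editor's note: for the technically curious, here's a minimal sketch of what "end-to-end" means in practice, using the browser's built-in Web Crypto API. This is an illustration, not Clinnect's actual code: the function name and parameter choices are our own assumptions. The idea is that the referral is encrypted with a fresh symmetric key, that key is wrapped with the recipient's public key, and the server only ever handles bytes it cannot read.]

```typescript
// Illustrative sketch only -- not Clinnect's actual code.
// Encrypt a referral so only the holder of the matching private key can read it.
async function encryptForRecipient(
  referral: Uint8Array,
  recipientPublicKey: CryptoKey // an RSA-OAEP public key with "wrapKey" usage
) {
  // Fresh symmetric key for this one referral.
  const dataKey = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    true, // extractable, so it can be wrapped below
    ["encrypt"]
  );
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message

  // Encrypt the payload with the symmetric key.
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    dataKey,
    referral
  );

  // Wrap (encrypt) the symmetric key with the recipient's public key.
  const wrappedKey = await crypto.subtle.wrapKey(
    "raw",
    dataKey,
    recipientPublicKey,
    { name: "RSA-OAEP" }
  );

  // The server only ever stores these opaque bytes; it holds no key that opens them.
  return { iv, ciphertext, wrappedKey };
}
```
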
[00:08:49]Jonathan: I have some questions. I don't think they're relevant.
[00:08:51] Chris: I love irrelevant questions.
[00:08:53]Jonathan: I was thinking, like, are there still cases of applications or services that aren't even hitting that baseline requirement?
[00:09:07] Chris: I mean, they're not, right? Like, the phone line provider could in theory read your data. Um, I mean, there's some advantages in some ways, in that the phone line provider isn't storing that fax, but ultimately you have to trust them when they say they're not doing that.
[00:09:23]Jonathan: One of the things about encryption is that it adds that layer of trust, or maybe the right word is you don't have to trust, right? Like a fax machine: you have to trust that the carrier, that telephone company, is acting in a way that is not, um, privacy-invading.
[00:09:41] Encrypting that communication means it doesn't matter. Like, we don't have to trust; the fax operation could be run by a bad person or company.
[00:09:50] Um, and they, you know, they can record all they want. It doesn't matter, because they wouldn't be able to read it.
[00:09:56] Angela: Exactly, that's it. Yeah. And I think, too, what we're also forgetting around the fax machine privacy issue is that you could send it to the wrong fax number, because there's no verification on the other end that they are who they say they are.
[00:10:16] Right. So it could end up on any fax machine.
[00:10:19] Chris: Yeah, I think that also has an interesting corollary to building a web application, 'cause the internet works a fair bit differently than the phone network. Say we had built this without that end-to-end encryption; there's lots of interesting problems that can happen. Now again, I've said that encryption to the server is the bare minimum, but it also becomes even more important when you start talking about the internet, because with a phone line connection, if I was to call you, Angela, how that call is going to get routed is controlled by the phone network.
[00:10:56] And it's pretty likely going to go to you, where the internet doesn't quite work that way. Um, the way the internet works is through a system called BGP, the Border Gateway Protocol. Basically an ISP, an internet service provider, or someone who's a big player on the internet, will sort of just say, hey, I'm handling the traffic for all of these addresses.
[00:11:14] And it's very brittle. There's actually been mistakes in the past where, uh, I don't remember the exact countries, but say an internet service provider in Brazil has said, I own all of the google.com IPs. And then everyone starts sending all of their traffic to Brazil, even if maybe they were right beside a Google data center.
[00:11:35] So it's also difficult to ensure how our traffic is even routed through the internet. Of course there's people monitoring this, and if you behave like a bad player they're going to boot you out. But ultimately it's important to have even that baseline encryption, and end-to-end encryption on top of that is even more helpful,
[00:11:54] um, just because the internet works so much differently.
[00:11:57]Jonathan: We've deliberately chosen to build Clinnect in an end-to-end encrypted way, which is kind of the most encrypted way we could build it. Or is there another, like, is there an even more encrypted way that we could build this?
[00:12:13]Chris: I think everything is going to be a compromise. There's probably some things we could do that would have been more encrypted, but anything you do is going to come with a little bit of a sacrifice to usability, right? So one thing we've done is, when you send a referral, anyone at the receiving practice can access it.
[00:12:34] That is, the doctor or their MOAs as well. We could have made it more encrypted by sending it specifically to the doctor,
[00:12:43] um, never allowing it to be sent to anyone else in the future ever again, and encrypting it directly for the doctor's keys. If we had done that, it would arguably be more encrypted, because you're reducing the number of people with access to the unencrypted version of that file.
[00:13:01] But that would obviously come with very large considerations for the user experience. So I think ultimately with these things, it's going to be a trade-off between the level of thoroughness in your encryption architecture and the user experience. And I feel like for something as important as patient data, we still have to make some product compromises, but we're right on the balance, in the sweet spot where it's an effective and usable product, and also highly secure compared to alternate approaches.

[00:13:31]An Analogy to Boxes and Locks

[00:13:31]Jonathan: When thinking about it from the user's perspective, we have to explain this to them sometimes, and help guide them to why this is better, why this protects them, like as practitioners, and protects patient data. Uh, and so we've tried to come up with analogies to explain this.
[00:13:55] So, um, in explaining this in the past, Chris kind of explained what we did in a very technical diagram. I tried to bake that into a different analogy, and then Angela took that and also tried to explain it to some potential customers. So I'm curious to hear that replayed back to us.
[00:14:14] Angela: Oh my, okay. So what I tell people, and let's go back to the primary care provider, is putting together a package. This package is a referral. So this referral package contains, like, everything about this person. So, highly sensitive patient data. What I say is that when you take this package, it gets put into a box that is locked. But depending on how many people can open it on the

[00:14:52]Editor's Note

[00:14:52] Jonathan: Okay, Jonathan here. Uh, I'm editing this and listening to Angela and myself trying to explain encryption through an analogy, and we go on and on and on about boxes and locks, and putting boxes inside of boxes, with locks inside of locks, and boxes and boxes and locks and boxes and locks. And it's very confusing.
[00:15:13] Um, and very hard to listen to. So I'm going to save you all the trouble and we're just going to skip that part. Suffice to say, we butchered an analogy trying to explain encryption. It was terrible.

[00:15:27]Back to the program

[00:15:27] Angela: the receiving team gave us. And so that lock gets put on that box, and that whole box gets put in another box with the key, uh, uh, damn.
[00:15:39] Chris: I built this and I'm not, I'm not following.
[00:15:41]It's a good analogy, and you're not wrong per se, but it's a struggle to use an analogy to explain the system, because anytime you try to be even remotely correct, the analogy starts to break down to the point that you might as well just teach someone cryptography.
[00:16:03] Angela: don't
[00:16:03] Jonathan: Okay. How does it work, Chris? What's this
[00:16:08] Angela: And you really don't need to use.
[00:16:10] Chris: Can I abandon the analogy?
[00:16:12] Angela: Yes, please. Please do ditch the analogy. So this all started from me saying to Jonathan, like, the cryptography that we've built into Clinnect sits in the background. As a user, you have no idea actually how secure it is, but it's privacy by design.
[00:16:32] This is what we've done with Clinnect. And, um, I wanted to showcase that. I wanted a really easy analogy. Apparently there isn't a really easy one. Okay, well then, go ahead. Yeah, I wanted to share something with users so they were like, oh yeah, cool.
[00:16:50] Chris: There is an easy analogy. I think the thing is, you have to trade off being correct. Um, both of you, I think, are trying to be, like, actually correct in the explanation, in which case you might as well just talk about the cryptography. I think if you don't mind quite a bit of oversimplification, an analogy is actually not too bad.
[00:17:11]Jonathan: So what's the oversimplified version of it?
[00:17:14] Angela: Yes, please do.
[00:17:15] Chris: The oversimplified version is, I would say: imagine a lock that has two keys, and one key can lock the lock and the other key can unlock the lock. Each key only turns one way, so it can only lock or only unlock. The key that unlocks it is your secret key. It's the one that you want to hold on to; you don't want to give that to anyone else. But the one that locks it, that's fine, because all it does is lock it. You can make as many copies of that as you want and send it to as many people as you want. So when you send a referral, what you're doing is you're asking the Clinnect server, hey, can I have the public key? And the Clinnect server says, yep, here you go, here's a copy. And you use that to put all the referral data in this box and you lock it. But you can't unlock it, and neither can we. And then you give the box to us. And then when the receiving specialist logs in, we give them the box, and they have the key that can unlock it, which is derived from their password.
[00:18:20] And we don't know their password, so we don't know the secret key. But they have that secret key and they can use it to unlock the box. That's the core. Now, of course, the part where that's oversimplifying is that there's actually multiple people who can unlock this box. Everyone at the receiving specialist's practice can unlock it.
[00:18:39] So that includes their MOAs, um, and that's where things start to become complicated, because what we actually do is we give keys to each user, and then keys that represent the practice. And then we take the practice's secret key, and we use each user's public key to encrypt it for them, so that they have their own kind of double-wrapped copy of the practice's key.
[00:19:02] But now you can see that it's starting to get complicated, and you can see where it breaks down. That's why you have to trade off the accuracy. We could talk about asymmetric versus symmetric encryption, and if you could explain it, um, it's actually not too hard, but maybe a bit longer than 30 minutes.
[00:19:20] Um, but it's honestly not quite that daunting. I think, yeah, if you want something for a nontechnical audience that is okay with a little bit of inaccuracy and simplification, then I like that analogy for it.
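[Editor's note: a hedged sketch of the "key derived from their password" step Chris describes, again using the Web Crypto API. The parameter choices, names, and exact wrapping layout are illustrative assumptions, not Clinnect's real scheme; in particular, this collapses the "double wrapped" practice key into a single unwrap for brevity.]

```typescript
// Illustrative sketch only -- parameters are assumptions, not Clinnect's real scheme.
// Turn a password into a key-encryption key. The server never learns the password,
// so it can never derive this key.
async function deriveUnlockKey(password: string, salt: Uint8Array): Promise<CryptoKey> {
  const material = await crypto.subtle.importKey(
    "raw",
    new TextEncoder().encode(password),
    "PBKDF2",
    false,
    ["deriveKey"]
  );
  return crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 600_000, hash: "SHA-256" },
    material,
    { name: "AES-GCM", length: 256 },
    false, // the derived key itself never needs to leave the browser
    ["unwrapKey"]
  );
}

// In the design described above, the practice's secret key is stored only in
// wrapped (encrypted) form, one copy per user. Unwrapping happens client-side
// at login, so the server only ever holds opaque bytes.
async function unlockPracticeKey(
  wrappedPracticeKey: ArrayBuffer, // stored server-side as opaque bytes
  iv: Uint8Array,
  unlockKey: CryptoKey // from deriveUnlockKey()
): Promise<CryptoKey> {
  return crypto.subtle.unwrapKey(
    "pkcs8", // the practice's private key, in wrapped form
    wrappedPracticeKey,
    unlockKey,
    { name: "AES-GCM", iv },
    { name: "RSA-OAEP", hash: "SHA-256" },
    false,
    ["decrypt", "unwrapKey"]
  );
}
```
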
[00:19:34] Angela: Okay, Chris, so people are going to be listening, and you're going to pique their interest. They're going to go, huh, what this guy's talking about is really interesting, and maybe I do want to know a little bit more. Where would you point someone who, let's say, is like me: knows very little about this, but is really interested in learning a little bit more about it?
[00:19:55] Chris: Google is a great resource. I think part of the, where the analogy breaks
[00:20:00] Jonathan: it.
[00:20:00] Angela: Just freaking Google it. God, I want to do something better, where we can, like, link in the show notes or
[00:20:07] Chris: Oh, I can link in the show notes. But if you ask me offhand, I mean, I learned most of this a decade ago, so it's a little bit challenging to put yourself in the beginner's shoes. But I could find some resources. Um, yeah, I think part of it is that the analogy skips the actual names of these things, right,
[00:20:26] which is asymmetric cryptography.
[00:20:29]Jonathan: It's hard to explain without explaining cryptography. How hard is it to implement? How hard is it to build this stuff?
[00:20:38] Chris: It's simultaneously easier than you would expect and harder than it should be.
[00:20:44] Angela: If that wasn't the classic Chris Foster answer, I don't know what is.
[00:20:49] Jonathan: I'm going to sit firmly on the fence.
[00:20:52] Chris: There's some parts, like the core concept of it, that feel quite simple. When we approached it and first started talking about the end-to-end encryption, I thought through some of the ideas and thought, yeah, this feels pretty approachable. Um, but the devil's in the details with this thing, I think, for sure.
[00:21:07] It's easy in the sense that we've leaned on a lot of existing models. With cryptography, the less you do that looks like something new, the better. So the one rule of cryptography is kind of that you should never implement your own cryptography.
[00:21:21] So we based this on a whole bunch of similar models, like the Firefox Sync architecture, as well as, um, LastPass's security model. Basically anything we could find in existing systems that were established, had been around for years, had lots of people looking at them, and were built by teams of experts.
[00:21:38] We wanted to try and copy as much as we could from those architectures. Some of the complicated bits have been that doing this in the browser was a little bit tricky. Some of the APIs are pretty new. We've been using what's called the Web Crypto API, um, which has just reached a stage where it is appropriate to be used, but it definitely differs quite a bit between each browser.
[00:22:00] And it's pretty hard to get it to work for some things that you need in some situations. So when we write out the whole plan, it feels very approachable, sensible; we're basically doing what everyone else has been doing. But then actually implementing it comes with lots of little gotchas that we had to work through. So I would say, yeah, there's no other way to put it other than to say it is easy and hard.
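[Editor's note: a tiny illustration of the kind of browser gotcha Chris mentions. The Web Crypto API is only exposed as crypto.subtle in secure contexts, so a sensible first step is to detect it and fail loudly rather than fall back to anything homemade. This is a generic sketch, not Clinnect's code.]

```typescript
// Illustrative check, not Clinnect's code. The Web Crypto API is only exposed
// in secure contexts (HTTPS), and older browsers shipped prefixed or partial
// versions, so detect it up front and refuse to continue without it.
function getSubtleCrypto(): SubtleCrypto {
  const subtle = globalThis.crypto?.subtle;
  if (!subtle) {
    // Never silently fall back to a JavaScript reimplementation of crypto.
    throw new Error("Web Crypto API unavailable: use a modern browser over HTTPS.");
  }
  return subtle;
}
```
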
[00:22:24] Jonathan: I like that answer. I like that answer. What are some other reasons why we wanted to build end-to-end encryption into this product?
[00:22:32]Angela: Maybe I'll take you back to when we were first talking about doing all of this. I actually remember the day that I kind of dropped the bomb on you, Jonathan, where I said, I don't think I want Clinnect, like anybody that works at Clinnect, to see any of these actual referrals.
[00:22:48] And I remember you kind of going, oh, okay, that changes things, you know? There were a couple of different business reasons behind this. It seemed like the most appropriate way to handle patient data.
[00:23:03] We don't need to see what's in those referrals. We don't want to see what's in those referrals. That is a hundred percent patient data that we should not be entitled to. Clinnect is a really small company right now.
[00:23:16] I mean, there's only a few of us that work there. Uh, I trust everybody that works there; I think they're amazing. Um, but what if Clinnect was to balloon into a team of hundreds of people, and all of a sudden I had an application where you could go in and see anybody's personal health data? That's not okay, in my opinion, at all.
[00:23:41] It would've felt weird to add that in after the fact, too. And I think a lot of the discussion that we had was, well, if this is the way that you want to do it, let's build it that way right from the get-go, rather than trying to add it in later, which I think probably would have been a nightmare.
[00:23:54] Chris: Borderline impossible.
[00:23:56] Angela: Or borderline impossible. There you go. So glad we made that decision
[00:24:02] We're a startup, we're a young company. We do not know where this company is going. We know who owns it right now, but what does it look like in 10 years? And would that have changed the direction that we went in?
[00:24:15] If we had access to that data and to be honest from a social enterprise perspective, it is not the world that I want to get into with having access to personal health data and managing the risk around that.
[00:24:31]Jonathan: We own it now. And what you're saying is that there's the potential that Clinnect gets acquired, and that acquirer could do something else with the data, even though our intention, even without the end-to-end encryption, was that we're not going to do anything nefarious with this data.
[00:24:48] Um, but now we've protected against that from happening in the
[00:24:51] Angela: in the future. And I mean, that's not a protection for me or Clinnect; that's a protection for every user and every person that has their data going through us. It was a decision that I didn't make lightly, that's for sure. But it also wasn't a hard decision to make, either. As soon as we ran through a couple of scenarios, I was like, whoa, why are we even considering not doing this?
[00:25:16] Chris: And also, even as, like, technical lead, that feels like a little bit of weight off my shoulders, um, knowing that we aren't creating this repository that is going to be such a massive target of personal data. Now, I absolutely think, especially as we continue to grow, we should treat it as if it were personal data: put all of those safeguards in place, and operational policies, and treat our security with the importance that we would as if we were holding patient data.
[00:25:47] But it sure makes me feel a whole lot better knowing that we,
[00:25:52] Angela: Exactly. Yep.
[00:25:53] Jonathan: And ultimately, what is the risk here? Like, what is our exposure to somebody doing something bad? What's the worst that can be done?
[00:26:03]Chris: If we're talking about the absolute worst case scenario, it's that someone could compromise our servers, or there could be a very malicious acquisition, and they could replace the version of the application that ships with one that has bad code in it. It could wait for the user to enter their password, then start decrypting data and pushing it somewhere else,
[00:26:23] unencrypted. That's a potential risk, but there's practical limits on it. For example, you would only be able to compromise individual users, and the rate at which you could extract data would be much slower than if you just had a giant database of, say, hundreds of gigs of private data. That's just a database you can download that has all the private data, whereas this must be a targeted attack against individual users.
[00:26:46] Right? You have to set up a server to receive that data, and then you have to also store all of that data. So that is, in theory, something that could happen, which is sometimes why end-to-end web applications get some criticism. But is it a whole lot better than if we didn't have that stuff encrypted? Absolutely.
[00:27:02] So I would say there's maybe targeted attacks that could, in theory, be a risk. But again, that's why our responsibility should still be to treat the security of the application as if it were holding plain patient data. And I would say, certainly from a hacker's perspective, that's no small feat, to pull off that sort of attack.
[00:27:23] Um, it's definitely far more complex than, say, just getting access to a database and downloading all of the data. It's definitely quite a bit more complex.
[00:27:35]Angela: When you talk about a targeted attack on Clinnect, it would be relatively unfruitful, 'cause it would take a long time, whereas there's a lot of other low-hanging-fruit targets. And so even that alone, right, is decreasing risk there too.
[00:27:51] Jonathan: Yeah, we make ourselves look less attractive
[00:27:56] than another potential target. And, I mean, that has happened already in our world. Like, LifeLabs was hacked and breached, and I don't know how many patient records were exposed, but.
[00:28:11] Angela: I can't remember. We can take a look and we'll put it in the show notes, um, link an article to it, but it was a significant amount. I mean, I was one of the people that received, uh, a notification that my stuff had potentially been
[00:28:31]Chris: Ultimately, nothing is a silver bullet, right? Um, I think one of the other things is that cryptography is not a replacement for user education. Um, the users are certainly probably the more likely weak point. It would be someone attacking an individual user's machine, or even trying to social-engineer them,
[00:28:49] um, which is, say, for example, calling them up and pretending to be Clinnect staff, or emailing them and saying that they need their password. Um, those sorts of things that our user might fall for are probably the most likely risk.
[00:29:03] Angela: Yep. Yeah, a little PSA: do not give your password over the phone to anybody.
[00:29:12] Jonathan: Ever. Ever.
[00:29:13] Angela: Ever. Don't do it, people.

[00:29:17]Recommendations for building an End-to-End encrypted app

[00:29:17]Jonathan: If someone wanted to build an end-to-end encrypted app, do you have any recommendations?
[00:29:23]Chris: Like we said, the core of it is pretty easy, but the hard bits are the hard bits. I think something that we already touched on, which is of course the first rule of cryptography: make sure you feel confident in what you're doing, make sure you have some sort of expertise in these systems, and don't ever create your own cryptography. Um, you want to always lean on what experts have done. So yeah, I would always say that if you are working with sensitive data and your goal is to build an end-to-end encrypted app, make sure that you're not doing anything new.
[00:30:03] Angela: I like that. I actually feel like you're demystifying the work that you're doing a little bit for the average, like, general population listening. I think we commonly think that you build everything from scratch, but that's not the case. And as you mentioned, in this case, it shouldn't be the case.
[00:30:25] Chris: Yeah, yeah. There's absolutely some small adaptations. Like I said, the Firefox Sync architecture or LastPass are ultimately different products than Clinnect, so there's some small adaptations. But ultimately, the architecture is really heavily leaning on what people have already done. And then the encryption itself, the act of actually encrypting the data,
[00:30:47] um, we wrote none of that code. That's all handled by the browsers through the Web Crypto API. So yeah, it's not quite as simple, but in essence we say, hey, encrypt this, and that's the extent of what we've implemented for encryption. The browsers handle all of that portion. Um, and if we had, say, implemented that ourselves, it just opens up so many doors for something potentially going wrong.
[00:31:11] So, um, in some respects, it is better to take the easier route.
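[Editor's note: "hey, encrypt this" is close to the literal shape of the call. Here's a minimal sketch of what delegating the actual cryptography to the browser looks like; the wrapper function is our illustration, while the encryption itself is entirely the browser's native Web Crypto implementation.]

```typescript
// Illustrative sketch: the application supplies a key, a fresh nonce, and bytes;
// the browser's native implementation does all of the actual cryptography.
async function encrypt(key: CryptoKey, plaintext: Uint8Array) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // never reuse an IV with the same key
  const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
  return { iv, ciphertext };
}
```
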
[00:31:17]Jonathan: At Two Story Robot, we take the easy path.
[00:31:23] Angela: It's all hard and simple at the same time.
[00:31:27] Chris: And it's not the easy route in some respects, too, right? Like, the easy route would be no end-to-end encryption. That's easiest.
[00:31:32] Angela: That's actually a really good point, Chris: we could have built this without any of this, and the law doesn't require us to do what we are doing. We are taking the extra, additional step and protecting patients and users. It's been an interesting journey for me, because I originally just thought, well, I just don't want to see any of it, and if we could build it like that, that would be great. And I
[00:31:58] had no
[00:31:59] Jonathan: an off the cuff
[00:32:01] you just
[00:32:02] Angela: off the cuff,
[00:32:03] but it was a thought-out decision. It certainly wasn't thought out to the point of what this means from a development perspective, at all.
[00:32:13] I didn't know what I was getting our team into. So
[00:32:15] Chris: Yeah, absolutely, and that's a fair point. The, uh, the non-end-to-end-encrypted version of this application is a much smaller application that would have been much faster to put together. But, I mean, yeah, we also don't know of any other provider doing something like this for medical referrals.
[00:32:34] So it's because patient privacy is so important that we wanted to ensure we took the time to think about the system and make sure we got it right.
[00:32:42]Jonathan: Taking the time to get things right. Uh, Chris, where can people find you and follow you, if they're interested?
[00:32:49] Chris: Um, I have a Twitter account and a blog with a mailing list, if you're interested in more technical details on stuff like cryptography or artificial intelligence. Um, if you Google chrisfosterelli, it comes up with all of my profiles. Don't Google just Chris Foster; I'm not the most popular Chris Foster, but.
[00:33:07]Jonathan: how many more years until you're the most popular?
[00:33:10] Chris: Oh, is that a goal? Do I have to commit to that?
[00:33:13] Angela: Yeah, Yeah, you do.
[00:33:14] Chris: Yeah. A decade. Ten years.
[00:33:17] Outro
Angela:
Thanks for listening to Fixing Faxes, building a digital health startup. I'm Angela Hapke and my cohost is Jonathan Bowers. Our guest today was Chris Foster. Our music is by Andrew Codeman. Follow us on Twitter @FixingFaxes. You can find us wherever you like to listen to podcasts. And please do us a favor and tell a friend. Thanks for listening.
[00:33:41]Jonathan: I wonder, I wonder if the memory foam topper is like the pop filter of camping.
[00:33:47]Angela: Maybe. Takes that edge off.
[00:33:49]Chris: My camping tent barely has enough room to sit up. So I feel like I am the laptop mic of camping.

© 2020, Central Referral Solutions