Data Privacy Considerations that Startups Can’t Ignore

What do VCs learn from your privacy practices? Katharine Tomko explains why it matters and shares a founder-friendly roadmap to data protection.

In this episode of Startup Success, we explore the data privacy considerations that all founders need to address, not ignore, with Katharine Tomko, Partner at First Ascent Ventures. Katharine fully understands the priorities and stresses founders face, so she lays out a clear roadmap of small, actionable steps they can take now that will pay off on privacy down the road.

With an intriguing background that spans from her first job as a parole officer to the early days of data protection programs at Facebook and now her work in venture capital, Katharine is a true authority on data privacy. She “gets” the mindset of founders and knows what they need to do today around privacy and data collection.

Katharine offers perspective on:

  • The big dangers of not prioritizing privacy in the AI era
  • Why more is not better when it comes to collecting customer data
  • Whether data mapping alone solves privacy
  • What VCs can learn about a founder from how they manage privacy

Founders and VCs – don’t miss this deep dive into a topic that’s becoming non-negotiable in the startup world!

This discussion with Katharine Tomko of First Ascent Ventures comes from our show Startup Success. Browse all Burkland podcasts and subscribe to the show on Apple Podcasts.

Episode Transcript

Intro 00:01
Welcome to Startup Success, the podcast for startup founders and investors. Here, you’ll find stories of success from others in the trenches as they work to scale some of the fastest-growing startups in the world, stories that will help you in your own journey. Startup Success starts now.

Kate 00:17
All right. Welcome to Startup Success. Today we have Katharine Tomko in studio, who is a partner at First Ascent Ventures. Welcome, Katharine.

Katharine Tomko 00:30
Thank you. Really happy to be here.

Kate 00:32
I’m looking forward to speaking with you. I want to get into First Ascent Ventures, but first, could you walk us through your background? Because you have one of the coolest, most interesting backgrounds I’ve seen.

Katharine Tomko 00:47
Yeah, where, where do you want me to start?

Kate 00:50
Maybe from college on, your career trajectory, because it’s pretty neat.

Katharine Tomko 00:49
Yeah, it was very much a jungle gym. I went to a pretty small university outside of Toronto called the University of Guelph. My desire was to be a drama major, but when I got there, I realized all the drama students were a little too weird even for me, so I quickly switched and ended up doing a double major in Criminology and Political Science, which has actually been incredibly helpful to my career. When I came out of university, I got a job as a Probation and Parole Officer in Kitchener-Waterloo. Folks I’m sure know Waterloo; it’s a hotbed of tech success. I found the job incredibly fascinating, but for someone with more of a social work mindset, someone who genuinely wanted to help people, it’s not quite the right role, because it’s very corrections focused. What I took from that job, though, is that you spend a ton of time doing risk assessments on people, because ultimately you want to determine how risky individuals are to society when they come out of prison or are put on probation. So I got very good at assessing risk, which ultimately really helped my role in privacy down the road. The political science work I focused on in university was also around privacy, even though it was the early 2000s and we did not think about privacy the way we think about it today. I did my thesis on biometric encryption, whether it was going to be a tool of surveillance or a technology of freedom. I really focused on what happens to individuals when you remove privacy. So when I left the world of probation and parole, I started looking for roles that touched on privacy. Again, there really were not a lot of people thinking about privacy the way we think about it today. Through happenstance, I connected with somebody who worked at Facebook, I was still based in Toronto, and ended up having quite a frank conversation with him about my thoughts on Facebook and privacy, really thinking through the success of the business from the lens of privacy. I ended up getting hired at Facebook back in 2008. Again, at the time there just weren’t a lot of privacy roles, so I was really fortunate to do a lot of other things at Facebook. Ultimately, when they were hit with the FTC consent order, in 2011 I believe, was when I really got to focus more on privacy. And one of the main things you had to do in that role was risk assessments. So I kind of took the experience I had doing risk assessments on humans and applied it to data. (Wow. Talk about full circle.) Yeah, totally. Full circle. I spent a decade at Facebook and then left in 2017.

Kate 03:54
Wow. And so for that decade, you said you transitioned into privacy and risk assessment. How many years did you end up working there on privacy?

Katharine Tomko 04:02
I guess it would have been six years. (Wow.) We got through a couple of what I would consider pretty successful audits with the FTC. It was a challenging job, though, for sure, because when we were figuring out how to build out the privacy program, we didn’t have a lot of guardrails. There was no GDPR, there was no CCPA. We were kind of making it up as we went along. The idea of privacy by design had come out, and we theoretically understood concepts like transparency and consent. But when a company’s business model has already been growing for as many years as Facebook’s had, it is definitely challenging to come in and put guardrails around it.

Kate 04:54
Oh, I can imagine. I mean, you were there at such an early time, right before it was the hot topic.

Katharine Tomko 05:00
Yeah. And then everything just kind of evolved from there. Like, I really do think Facebook is the company that sort of pushed a lot of the privacy regulations to come to fruition.

Kate 05:13
Oh, that’s cool. So then you’re there for 10 years. How do you transition into VC investing? Because that’s a big change.

Katharine Tomko 05:24
Yeah. So for me, I genuinely still believed that privacy was a critical pillar of the success of a business. What I learned from Facebook is that when you try to bolt it on afterwards, it is really, really challenging: re-architecting the back-end infrastructure, changing policies and procedures the company has followed for a long time. So I thought, well, I’m going to go out and consult with startups, and I’m going to help them figure out their data strategy: where data is coming from, how they’re processing it internally, what rights they have, whether they’re being transparent with end users, etc. And the feedback I got from founders was that they understood theoretically how important privacy was. It’s rare that somebody is going to say, oh, I don’t care about privacy at all. The challenge was that they said, you know, our board members and our investors don’t talk about this. They don’t push us on privacy. And ultimately, we take our cues from what our board members and our investors want us to do at this stage. So I thought, well, if that’s the case, I’m going to go become an investor and a board member and see if we can move the needle on how folks think about privacy from an investment perspective, in the sense of: we’re not going to write a check unless we can do light diligence, whatever stage of company you’re at, because there are different expectations at each stage, to see whether you’re actually thinking about this. Because we knew that not thinking about privacy, especially now with the uptick of AI, was going to be very damaging to the business and, ultimately, potentially society.

Kate 07:14
That’s so commendable. You said a couple of things that really struck me. One is that you saw firsthand what happens when you try to build out privacy after the fact, and how that led you to want to work with startups. The second was the feedback you got from founders that investors and board members weren’t putting a priority on this, and we still see that today, so I want to get into that later. But good for you for channeling that into a career in investing. So is that the motivation behind First Ascent? Was that your first stop in investing, or was there somewhere else?

Katharine Tomko 07:52
Yeah, so when I was thinking about getting into venture capital, I really had this very solid privacy thesis in mind, and I had connected with a number of VCs and pitched my idea. Now, First Ascent Ventures is a fund based out of Toronto, and I am originally from Toronto, although I’m based in the Bay Area. When I met with the founders of First Ascent, Richard and Tony, they were very receptive to this idea, more so than anyone else I had connected with. This was back in 2019, and they really understood that this was potentially going to be table stakes: let’s bring it into the pillars of how we do diligence, how we think through supporting companies, etc. And I thought, well, this will be great, I’ll be able to spend more time in Toronto, back home with my family, so it was kind of a perfect fit. Unfortunately, COVID hit and that idea went out the window, because there was no traveling, but it was still a great fit for me.

Kate 08:58
That says a lot about First Ascent. So tell us about the fund.

Katharine Tomko 09:02
Yes. So we are an enterprise software fund out of Toronto. We do solely enterprise software, Series A and some Series B. We have $125 million in assets under management. We’re in our second fund, kind of the tail end of it, so we probably have a couple more investments to do. We’re industry agnostic; I don’t know if I mentioned that already. Are we completely different from a lot of enterprise software funds? No. We look for high ACVs, low churn, mission critical. Mission critical even more so now, given tightening budgets. But I will say, I think our differentiation is the lens of privacy that we bring.

Kate 09:53
Yeah. Wow. And did you help spearhead that at the firm? (Yeah.) Wow, that’s incredible.

Katharine Tomko 10:03
It’s been good. Part of the challenge of investing in venture capital is really assessing the founders: how do we really get comfortable with whether these founders are ethical, whether they’re going to build a great business, etc.? Privacy, yes, is a great way to ask, are we investing in something that is very risky down the road, in the sense that they’re hoovering up data they should not be hoovering up, selling it to third parties they should not be selling it to, or not putting protections in place, which increases the risk of data breaches. But the other thing I really like about it is that it gives a very good sense of where the founder’s or CEO’s mindset is with respect to ethics, and that’s a big thing now, especially with AI. Everybody’s talking about: is there some sort of ethical AI? Are they thinking about transparency, etc.? So the light privacy diligence we do helps us in a number of areas.

Kate 11:12
Gosh, I have been doing this show for so long, and I have yet to hear a VC talk about privacy. I mean, now they’re starting to talk about ethics. It’s funny how far ahead of this curve you were. And you’re right, it is a good roadmap for how ethical challenges and decisions would be handled by a founder.

Katharine Tomko 11:35
Yeah. I mean, I’ve been involved in a lot of initiatives around responsible AI. That’s challenging. I think if you take one step back and just assess how they think about data, how they think about their customers, how they think about their customers’ data, you’re going to have a good barometer for how they’re going to think about leveraging models, feeding models with whatever data, etc.

Kate 12:02
Absolutely, it’s a great framework. You’re right. So a little later in the show, I want to get into your advice for the founders listening around privacy. But before we go there, I want to ask you about the Alpha Network, because you’re involved and there’s a lot of buzz around it, especially here in the Bay Area. We’ve got a lot of first-time founders listening, so if you wouldn’t mind sharing, that would be great.

Katharine Tomko 12:31
Yeah. So the Alpha Network was founded in 2005, I believe, and it’s invite only. They bring together founders, tech leaders, and VCs in this amazing, somewhat intimate environment where they host events on all sorts of topics, literally everything from cybersecurity to how to build a go-to-market strategy, et cetera. They bring these folks together in dinners or small group settings, give us a topic, and then allow folks to discuss it in a very confidential, Chatham House Rules setting. It really allows people to be candid about the challenges they’re experiencing and to ask leaders in these various fields honest questions about how to think through things. And I will say, I’m part of the Silicon Valley advisory board, and it really is one of the most wonderful networks I’ve ever been a part of.

Kate 13:38
I’ve heard such good things about it. Have you had some pretty incredible conversations with that group around privacy? I mean, they must kind of lean on you for that in that area, right?

Katharine Tomko 13:49
Yeah, we’ve done some events and panels at the various conferences talking about privacy. And yes, people are very receptive to it. I’m getting the sense that people really want to do right by privacy; it’s kind of like you can’t not do it anymore. But privacy is such a… it’s an ideology, and it’s very hard to operationalize an ideology. So I think folks are really just looking for information: what do we do first, how do we do a risk assessment, etc.

Kate 14:27
That makes sense. So let’s go there for the founders listening. How do you get started? Because I talk to so many founders, and they’re very passionate and knowledgeable about what they’re solving for. But when it comes to privacy, they want it, but it’s like, how do you get there?

Katharine Tomko 14:46
Yeah. I think the first thing is: what is your business model, and how do you expect to make money? Because if how you expect to make money is leveraging people’s data, which is pretty much everybody’s business model now, then you have to get very clear on where the data is coming from and what rights you have to use it. And this is where it gets a little complicated, because we have a lot of state-by-state regulations, we have the GDPR out of Europe, we have privacy regulations out of Canada, and they’re all slightly different. Slightly different in the sense of: does consent work, going to the end user and saying, hey, this is how we’re going to use your data, are you okay with it? Or do you have to leverage the data in other ways? And I think the challenge, especially now, and probably why founders are more focused on it, is AI, because there is some uncertainty around whether we can leverage existing privacy regulations to build AI models. If you are getting consent from an individual on how you are going to use their data, you need to be very clear about the use cases, why you are using the data. If you feed it into an AI model, you don’t necessarily know all the potential use cases, or what the output will ultimately mean. So I think there’s a lot of confusion around that right now.
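
To make that purpose-limitation idea concrete, here is a minimal sketch in Python, with hypothetical names, of recording the specific purposes a user consented to and refusing any processing whose purpose was never disclosed:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: store the exact purposes a user agreed to,
# and allow processing only for a purpose that was disclosed.

@dataclass
class ConsentRecord:
    user_id: str
    purposes: set = field(default_factory=set)  # e.g. {"billing", "support"}

def can_process(consent: ConsentRecord, purpose: str) -> bool:
    """A use is allowed only if the user consented to that exact purpose."""
    return purpose in consent.purposes

consent = ConsentRecord(user_id="u123", purposes={"billing", "support"})
print(can_process(consent, "billing"))         # True: a disclosed purpose
print(can_process(consent, "model_training"))  # False: never disclosed,
# which is the gray zone Katharine describes for AI training.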

Kate 16:29
Wow, I’ve never heard it kind of explained that way, and it makes a lot of sense. I mean, you don’t know all the use cases. I don’t even know if the founder…

Katharine Tomko 16:39
That’s the huge challenge around it right now. So then you say, okay, let’s recommend: do not put personally identifiable information into your models. Full stop. Don’t do it. Okay, fine. Well, then you have to go to the founder and say, well, what do you consider personally identifiable information? Name and address? Okay, fine, don’t put that in the model. But the way AI works now, there are a lot of behavioral characteristics. You don’t have to put in Katharine Tomko’s name plus her address; the model could probably infer from my behavior, oh, this is Katharine Tomko, even if you don’t put traditional PII into it. So we’re in a very gray zone right now.
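
As a small sketch of why “just remove the PII” falls short, here is a hypothetical example (field names invented) where stripping direct identifiers still leaves behavioral signals that can re-identify someone:

```python
# Hypothetical sketch: drop "traditional" PII fields before a record
# reaches a model, and note what survives: behavioral fields that can
# still point to one person, the gray zone described above.

DIRECT_IDENTIFIERS = {"name", "email", "address", "phone"}

def strip_direct_pii(record: dict) -> dict:
    """Remove obvious identifiers; behavioral signals remain."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Katharine Tomko",
    "email": "k@example.com",
    "login_hours": [7, 7, 8, 7],              # behavioral: daily routine
    "favorite_topics": ["privacy", "venture"],  # behavioral: interests
}
print(strip_direct_pii(record))
# {'login_hours': [...], 'favorite_topics': [...]} -- no name or email,
# yet a distinctive enough pattern may still identify the individual.
```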

Kate 17:23
So to cover kind of the basics there, what else should a founder do when getting started?

Katharine Tomko 17:30
Well, right now there’s a lot of talk about data mapping, and a lot of external tools are coming out to help map your data, which I think is pretty important. I’ve heard the analogy that if you have a very messy room, you’re not going to know where your valuables are. The idea is that you’d better understand where your data is: where it lives, what it gets joined with, what the data flows are, how it comes into your systems, who uses it and for what purposes, and how it goes out. You need to understand that in order to make good decisions about how to ultimately leverage it. I know a lot of companies try to do this manually, internally. It’s a tough problem. For startups that are just starting to ingest data, if you have good data practices from the beginning, it’s probably easier to do. For companies that already exist, bringing in data mapping solutions, or trying to do it internally, is a little hairy, because you’re never 100% sure you’ve captured everything.
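
For illustration only, here is what one entry in a simple data map might look like as a Python record; the asset name, fields, and consumers are all hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of one entry in a data map: for each data asset,
# record where it comes from, what it contains, why it was collected,
# who touches it, what it gets joined with, and how long it is kept.
# A real data map is just many of these, kept current.

@dataclass
class DataAsset:
    name: str
    source: str          # how the data enters your systems
    fields: list         # what it contains
    purpose: str         # why you collected it
    accessed_by: list    # teams or services that use it
    joined_with: list    # other assets it gets combined with
    retention_days: int  # how long before deletion

signup_events = DataAsset(
    name="signup_events",
    source="web signup form",
    fields=["email", "plan", "signup_ts"],
    purpose="account creation and billing",
    accessed_by=["billing-service", "support"],
    joined_with=["payment_records"],
    retention_days=365,
)
print(signup_events)
```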

Kate 18:40
Yeah, it sounds like there could be some confusion. So you’re recommending founders start this from day one, as they’re building out. You really put an emphasis on this.

Katharine Tomko 18:56
Oh, 100%, but I am the person who has been touting this for over 20 years. I also very much understand the founder’s perspective: we have a limited budget and very limited resources, we want to build, we want to find product-market fit, and the last thing on our mind is privacy. I really do understand that. So it is about striking that balance, and I think that can be done. Ultimately: understand your data. There’s a whole concept of data minimization, in the sense of, do you really need all of this data? Because if you think about it as a honeypot for cybersecurity breaches, or something that ultimately gets you in trouble down the road, perhaps don’t collect it all, or collect it and delete it.
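
As a sketch of the “collect it and delete it” idea, here is a hypothetical scheduled retention job in Python that purges event rows older than a fixed window, so data you no longer need stops accumulating as a honeypot:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: a periodic job that deletes raw event rows older
# than a retention window. Table name and schema are invented.

RETENTION_DAYS = 30

def purge_old_events(conn: sqlite3.Connection) -> int:
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute("DELETE FROM events WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # how many stale rows were deleted

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, created_at TEXT)")
conn.execute("INSERT INTO events (created_at) VALUES ('2020-01-01T00:00:00+00:00')")
print(purge_old_events(conn))  # 1 -- the old row is gone
```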

Kate 19:51
Wow, that’s really interesting. You said two things there. First, I think a lot of founders listening would appreciate that you mentioned finding product-market fit; that’s where a lot of their attention and expense lies. But second, not keeping all the data. I feel like everyone right now amasses all the data they can because they think at some point it might be worthwhile. Do you see that?

Katharine Tomko 20:19
Oh, absolutely. It’s because data is the new oil, or the new gold, or whatever people want to say. They’re just like, wow, look at this trove of data, I can monetize it in so many different ways. The challenge, though, is that we are starting to see much more sophisticated, hostile AI. And if you look at nation states, or individuals who really want to impact the US or whichever country, they know that data is the goal, and they are going after it. So just in terms of protecting your company, yourself, and your users, it is important to think through: do we need all of this data? For the most part, you don’t.

Kate 21:08
Wow, that’s great advice, and so different from what you usually hear, but it makes sense. We’re actually coming up on time, so I want to ask: is there anything else you want to mention around privacy for the founders listening? You’ve shared a lot of great tidbits. I mean it, you really helped with the tactical early stages of privacy, which I appreciate.

Katharine Tomko 21:33
My biggest piece of advice, perhaps, having worked in the privacy world for as many years as I have: there are so many complicated frameworks, regulations, and expectations, but in the end, put yourself in the shoes of the end user. How would you feel about your information being used in the way that you are using it? And I understand there are certain people who say, oh, I have nothing to hide, I don’t care. That is not most people, because if you actually peel back the layers of that, it’s not a great thing to say. You should care about how your data is being used, especially in this digital age that is moving toward an AI age, where the sophistication of these models is going to be such that we could have 24-hour surveillance. And is your company potentially contributing to the advancement of that?

Kate 22:38
I love that advice, because I’ve heard so many founders come from a place of regret with data. Or they’ll frame it like, now that I have a child who is coming of age in this era of data, they have even more regrets. So yes, put yourself, or a child, or a family member, in those shoes. Absolutely.

Katharine Tomko 22:59
Well, exactly. I mean, if you’re working in the world of tech and you’re building a company, then you understand the direction we are going. So do think about your children and your grandchildren.

Kate 23:12
Well said. This was so fascinating, Katharine, because we’ve never really addressed privacy on this show in this way. Thank you so much. For those listening who want to learn more about First Ascent Ventures, where do they go?

Katharine Tomko 23:28
Just go to firstascent.vc and you’ll be able to find all the information about us.

Kate 23:38
Awesome. I love it. I love the emphasis on privacy, too. Thank you so much for being here. It was really fun.

Katharine Tomko 23:44
Thank you. It was so fun.

Intro 23:48
You’ve been listening to Startup Success. To make sure you don’t miss out on future episodes, subscribe to the show in your favorite podcast player. Like what you hear? Tap the number of stars you think the show deserves in Apple Podcasts. For more tools and resources for your own startup success, check out burklandassociates.com. Thank you so much for listening. Until next time.