NILAY PATEL: You left, Amazon said we’re going to stop working with police, you came back, boy, Ring is going to work with police again. You have a partnership with Axon, which makes the taser, that allows law enforcement to get access to Ring footage. Did that feel like a two-way door? They made the wrong decision in your absence, and you came back and said, “We’re going to do this again”?
JAMIE SIMINOFF: I don’t know if it’s wrong or right, but I think different leadership does different things. I do believe that I spent a lot of time going on ride-alongs. I spent a lot of time in areas that I’d say are not safe for those people, and I’ve seen a lot of things where I think we can positively impact them. So, we don’t work with police in the way of ... I just want to be careful, as we’re not ... What we do allow is for agencies to ask for footage when something happens. We allow our neighbors, which I’ll say at this point are our customers, just to be clear, we allow our customers to anonymously decide whether or not they want to partake in that.
So, if they decide they don’t want to be part of this network and don’t want to help this public service agency that asks them, they just say no. If they decide that they do want to, which, by the way, a lot of people want to increase the security of their neighborhoods. A lot of people want their kids to grow up in safer neighborhoods, a lot of people want to have the tools to do that, and are in places that are dangerous. We give them the ability to say yes and make it more efficient for them to communicate with those public service agencies, and also do it in a very auditable digital format.
That’s the other side. Today, without these tools, if a police officer wanted to go and get footage from something, they’d have to go and knock on the door and ask you, and that’s not comfortable for anyone. There’s no digital audit trail of it, and, with this, they can do it efficiently with an audit trail. It is very clear, and it’s anonymous.
JAMIE SIMINOFF: But when you put AI into it, now, all of a sudden, you have this human element that AI gives you. I think, with our products in neighborhoods and, again, you have to be a little bit specific to it, I do see a path where we can actually start to take down crime in a neighborhood to, call it, close to zero. And I even said, there are some crimes that you can’t stop, of course.
NILAY PATEL: Mechanically, walk people through what you mean. You put enough Ring products in a neighborhood, and then AI does what to them that helps you get closer to the mission of zeroing out crime?
JAMIE SIMINOFF: So, the mental model, or how I look at it, is that AI allows us to have ... If you had a neighborhood where you had unlimited resources, so every house had security guards and those security guards were people that worked the same house for 10 years or 20 years, and I mean that from a knowledge perspective. So, the knowledge they had of that house was extreme; they knew everything about you and that residence and your family, how you lived, the people that came in and out.
And then, if that neighborhood had an HOA with, call it private security, and those private security were also around and knew everything, what would happen? When a dog gets lost, you’d be like, “Oh, my gosh, my dog is lost.” Well, they would call each other, and one of them would find the dog very quickly. So, how do we change that and bring that into the digital world is—
Can I just ask you a question about that neighborhood specifically?
Sure.
Do you ever stop and consider that that neighborhood might suck? Just the idea that every house on my street would have all-knowing private security guards, and I would have an HOA, and that HOA would have a private security force.
You can easily paint that as dystopia. Everyone’s so afraid that we have private cops on every corner, and I’m paying HOA fees, which is just a nightmare of its own.
So, I would assume you live in a safe neighborhood.
I hope so, yeah.
No, today, I’d go to ... If you want, I’ll take you to a place where people live and have to, when they get home from school, lock their doors and stay in their house, and they can’t go out and—
But I’m just saying that that model is “everybody is so afraid that they have private cops.”
I think the model is that doing crime in a neighborhood like that is not profitable, and I think that you want people to move into another job. I don’t think that crime is a good thing and so I think ... But listen, it certainly is an argument to have, I do believe that ... I think safer neighborhoods allow for kids to grow up in a better environment and I think that allows them to be able to focus on the things that matter and so that’s what we’re going for.
I just wanted to challenge the premise.
I think it’s a fair challenge.
The model is that there are cops everywhere. That level of privacy.
Yeah, it’s not cops. I think it’s more that you’ll have the ability to understand what’s happening. It’s not like ... But yeah, I think, listen, it’s a fair statement, I guess. I think I want to live in a safe place.
There’s a lot of intelligence in your neighborhood, and maybe it’s private security, maybe it’s not. What does the AI do? Does it just make the camera smarter? It lets you do a more intelligent assessment of what the cameras are seeing?
Right now, we just say motion detection, motion detection, motion detection. It’s funny, when I started Ring… The book was fun because I got to go back and actually go through this whole story of how this thing came to be, and motion detection was an amazing invention. You’re in the airport, and there’s a motion at your front door, and you look at it like, “Wow, this is crazy.”
Now, with AI, we shouldn’t be telling you about motion detection; we should be telling you what’s there, when you should look at it, when it matters, and we shouldn’t be bothering you all the time. That’s what I mean by this idea of these security guards at your house or in your neighborhood. There should be this intelligence in your neighborhood that can tell you when you should be trying to be part of something, but not always tell you. So, it’s not just like, “Car, car, dog, person, person.” It’s like, “Hey, look at this. You want to pay attention to this right now.”
NILAY PATEL: Do you think, when you talk about zeroing out crime in a neighborhood, the idea that everyone in a neighborhood has one of those illuminated Ring signs in the front yard, is that enough to—
JAMIE SIMINOFF: It’s a part of it.
Is that just enough of a deterrent? The bad guys will know their face is going to be captured on video, and that will be analyzed by an AI, and something will happen. Do you have to do more outbound deterrents?
I think that’s a part of it. Awareness is a big part of it. I think there are ways with lights also, using lighting to do stuff, that’s a big part of it. I think having just ... If, all of a sudden, someone comes outside because something’s an anomaly, that’s a big part of it. It doesn’t have to be some crazy thing. And that’s what I was saying, is a lot of these little things add up to make that work.
So, when you think about it, okay, we can bring crime down in a neighborhood to close to zero, what are the ratcheting steps? Does everyone just get the Ring camera, and your platform does all the work? Is it that someone gets caught and they tell all their friends in jail that they got caught? What are the steps?
I think it’s really about bringing neighbors together for this particular thing. So, it’s about how you individually… and we’ve always thought about how each house is its own node controlled by the neighbors, so controlled by the person, and I’ll keep going back to that, which is one hundred percent, your video is in your control; everything you’re doing is in your control, whether you want to take part in anything is in your control. That has to be the first layer of all of it.
But then, when something happens, do you want to take part in it? So, if you get an alert that this dog looks like the dog that’s in front of your house, can you contact your neighbor? You can decide not to take part in it, and then no one will ever know, and it’s fine, it’s just basically deleted, or you can take part in it. I think that’s how we can do things that can make a neighborhood into this node where individual neighbors are all on their own, but when things happen, they can work together as they want to.
And you think that AI will accelerate the process?
I think AI is a co-pilot. It is their assistant, and it’s helping them to figure this out. Because, again, if you’re just getting every motion alert, and if you have eight cameras and you’re just getting motion alerts all day, no human being can parse all this data. So that’s what I was talking to Jen about, is that I do think I see a way to use AI to help feed better data to us, which allows us to make better decisions and work together better.
NILAY PATEL: But when you connect a bunch of those databases, particularly to facial recognition, there’s a turn in the privacy conversation where the stakes ratchet up really high, where maybe it’s gone forever.
How are you thinking about that decision-making? Okay, we have a lot of intelligence in the AI; it’s trivial for the AI to connect to another store of information. That’s a thing you can do with AI, especially at a big company like Amazon, where you have lots of other stores of information. There’s a line, what’s the line for you?
JAMIE SIMINOFF: There is a responsibility, obviously just to build safe products. So let’s just start with that. Yeah, we did announce facial, we call it Familiar Faces, but that’s not connected, that’s just for your... Your iPhone today. If you search your iPhone, it’s crazy. Search for someone’s name in your photos, and their pictures come up.
So I do think there’s a balance between not allowing technology to exist that should exist that helps people and gives them more efficiency, gives them safer homes and then also, obviously, not creating this dystopian place. And so, I think that’s the responsibility, but what we’re doing with Familiar Faces is we’re just giving you the ability to say, when my wife comes home, don’t... Because it is silly. Why do I get an alert when my wife comes home? I don’t want it, I don’t need it.
I’m asking this for a lot of reasons, but I look at what’s broadly happening with surveillance footage out in the world. And I’m not saying Ring is participating in this, I’m just giving you an example. ICE has facial recognition systems, and they are arguing that a positive match in their facial recognition system is a definitive determination of someone’s immigration status. That’s way out there. I don’t think you’re doing that.
But you can get to, “Okay, we have facial recognition, we have a bunch of evidence coming off of Ring cameras, to make it really safe, you want to go from passive surveillance to active surveillance. That’s what the studies show. Now the camera will literally identify the criminal by face and tell the cops this person tried to steal a car from this driveway,” and that’s the thing that would get you to actually zero out crime.
There’s a lot of risk in those steps. But if I draw the thread from what you’re saying, it’s all the way to the idea that the criminals won’t come here because the cameras will know who they are and tell the cops. Are you willing to go that far?
I think it’s also that the cameras will alert people. Part of what made Ring and what made neighbors safer with Ring 1.0, and I think we are in Ring 2.0, is that there was no presence at the home. How did people break into homes? They would go and be knock-knock burglars. They would knock-knock, no one was home. It was 3PM, they’d go to the homes next door, find a place that was empty, and they’d go into the home.
Ring allowed you to, now, all of a sudden, when someone comes up to the door, you’re like, “Oh, I got a motion alert. Hi, what’s going on?” and so it gave a presence to the home. So, I don’t think you have to go as far as that real-time stuff to get to where we’re talking about. I think it’s more the anomaly detection and allowing people to make it so that, if someone comes in, you’re aware of what’s happening around the neighborhood, because right now there’s no awareness of what’s going on around it.
So I don’t think it’s as dystopian as where you’re going, and certainly it’s not what we’re building, and I do think we can impact things to a really high level in neighborhoods. Which, again, to the Jen Tuohy thing, in neighborhoods is what we were talking about, that with AI and what we’re doing with a bunch of Rings together. I think even the Dog Search Party is a good way to look at it, which is how these cameras come together for good in the neighborhood.
NILAY PATEL: Presuming we have to have an authenticated server, there’s a crime in my neighborhood, and I’ve opted in, and we’re going to say the cops can only get the video from the Ring server, where we know it’s true. I might not be as in control of my video anymore.
JAMIE SIMINOFF: No, not how it’s built and not while I’m here because the way it works is that you will decide if you want to or not want to share that video, which is your property, with someone. Now, once you share it, then it is up to us to figure out, to your point, how do we share it, how do we make sure that the digital fingerprint goes all the way through, or how does the chain of custody work of this video to make sure there’s no fake in the process of it? I think this is why it is important to build these systems.
It’s going to be important, though. This is also where the government is going to have to step in. We’re going to have to deal with this across the board because we also have video coming off of cell phones. So, we do need to figure out how to build... And there’s going to be companies, Axon would probably be one of the companies. I don’t want to speak for them, but they have evidence.com, so to build these evidentiary systems to take in…
Because Ring is one part of taking in data around, call it a crime scene, but cell phone video is maybe even more of a source today. So, how do you take that in? How do you make sure that it actually was captured on the iPhone directly and not tampered with between the two things? We’re going to have to figure it all out. I think we have to work together on it, and the AI stuff is pushing us to do it. I am proud that with Ring, we have built it so that you can take it directly and keep it on the server. You can understand where it was, where it’s from, where it was created, and we have that digital fingerprint on it and the audit trail of it.
You’re going to have to do that more and more as this world is changing. You’re just not going to be able to trust a video simply because someone sends it to you.