Versions of the original Echo made before 2017 left debugging contacts accessible to anyone who can get to them. But if that happens, you already have bigger issues.

There are a couple of ways to think about security when it comes to our connected devices. One is binary: either they're safe, secure, and not being actively exploited, or they're not.

I take a different tack. I look at my connected devices, assume someone can see or hear me through them, and work backward from there when it comes to publicly announced exploits and hacks. How is the hack achieved? Does it require physical access to the device? Do I have to do something to it first, like install an app from a dubious source? Is it something a little scarier, like the recent Broadcom vulnerability? And what's the hardware manufacturer's track record when it comes to updates?

Important things, all. And something to keep in mind when we look at the recent disclosure of a "hack" of the Amazon Echo, as detailed in Wired. Specifically, we're talking about the 2015 and 2016 models. So if you've bought one this year, you should be OK.

Unless you're an active target of a hacker, requiring physical access to a device generally means you'll be OK.

The short version is this: Those earlier Echo models were manufactured in a way that let someone physically attach a little extra hardware to the Echo (a bootable SD card, actually), hidden out of sight under the rubber footing. That would let them listen in on what was being said, record it, and fire it off anywhere the attacker pleased. (That's in addition to other nastiness.)

There are a few things to keep in mind here, all of which the write-up of the exploit rightly considers.

First, the hacker would need physical access to your Echo. And if you're already an active target and someone's able to get into your home, you've got much bigger issues than Alexa listening in. (Like, say, planting a real bug somewhere else. Or multiple somewheres else.)

Second: The hacker would need physical access to your Echo. This isn't just a software thing. It's worth mentioning twice.

That's not to say there aren't scenarios in which I might worry a little more, however. The original write-up also mentions that the larger (yet still theoretical, as this is all part of a proof-of-concept thing) issue could be in places like hotels, where more people have access.

The Wynn hotels in Vegas announced in December 2016 that they'd put an Echo in every room. While I don't hate the idea of controlling the lights and window shades with my voice, a hotel room is exactly the sort of place where I wouldn't trust this sort of thing. On the other hand, I have no idea whether a casino hotel — already wired up more than just about any other place you can visit without a security clearance — is listening in on everything I do anyway.

Pick your poison, really.

A potentially hacked Echo in a hotel room? That's another story.

So, yeah. This is an interesting potential exploit. But it's one that requires me to have an older Amazon Echo. At home, that's something I can rectify myself. (Get one that doesn't have model number 23-002518-01.) It also requires an attacker to have physical access to my Echo, which, again, is far worse for me for a host of other reasons.

And, finally (or, rather, first), it requires me to be a target. This isn't something someone can just stumble across while walking down the street or logging onto someone's Wi-Fi network.

For now? I'm just a guy with an Amazon Echo who's still going to sleep just fine at night.