The “smart” personal assistant: it’s one of the hottest new tech trends. But can we trust devices like Amazon’s Alexa or Google Home? The story of one Portland, Oregon, family points to lingering, serious security challenges surrounding these devices.

If you’re not familiar with them, virtual assistants like Alexa and Google Home are designed to help with a variety of simple tasks using voice activation technology. Although these devices can work like your typical portable speaker, playing music and podcasts, they’re also designed to help answer questions like “What will the weather be like today?” and “Where’s the nearest pizza place?” Personal assistants can also help place calls, schedule meetings, and make adjustments to your calendar.

For many busy households, being able to carry out these simple yet important tasks using only your voice is quite convenient and, in some cases, can save time. But for others, privacy and security concerns have made investing in one of these devices off-putting.

Take, for example, the story of a Portland family that was recently startled to discover that their Alexa unit had recorded a conversation about hardwood floors and sent the recording to a co-worker. The family only learned of the incident when the co-worker called to inform them.

The Portland woman whose conversation was recorded — and who has chosen to remain anonymous — said she “felt invaded” by the episode. In response, she says she’s “never plugging that device in again, because I can’t trust it.”

For its part, Amazon has verified that the episode took place and says the recording occurred because the device heard its name or another so-called “wake word” (a word selected by the user to get the device’s attention). It’s possible the family in question had created a custom wake word and uttered it during their conversation about hardwood floors.

However, the fact that Alexa sent that discussion to someone else is even more remarkable, and more disturbing. According to Amazon, the transmission occurred because Alexa interpreted part of the conversation as a “send message” request, and something else in the discussion resembled the name of the co-worker who ultimately received the recording.

Although it has offered an explanation, Amazon hasn’t denied that the incident warrants investigation. The company says it is currently “evaluating options to make this case even less likely” in the future.

Despite growing privacy concerns about personal assistants, Amazon says its user base for Alexa continues to grow. The company noted that tens of millions of Alexa units were purchased over the holiday season, and that millions of Amazon customers regularly use the device for everything from planning their day to making online purchases.