NEW YORK • The scene opened on a room with a red sofa, a potted plant and the kind of bland modern art one would see on a therapist's wall.
In the room was Mrs Michelle Obama, or someone who looked exactly like her. Wearing a low-cut top with a black bra visible underneath, she writhed lustily for the camera and flashed her unmistakable smile.
Then, the former United States first lady's doppelganger began to strip.
The video, which appeared on online forum Reddit, was what is known as a "deepfake" - an ultra-realistic fake video made with artificial intelligence software.
It was created using a program called FakeApp, which superimposed Mrs Obama's face onto the body of a pornographic film actress.
The hybrid was uncanny - if one did not know better, one might have thought it was her.
Until recently, a realistic computer-generated video was a laborious pursuit available only to big-budget Hollywood productions or cutting-edge researchers.
But in recent months, a community of hobbyists has begun experimenting with more powerful tools, including FakeApp - a program built by an anonymous developer using open-source software written by Google.
FakeApp makes it free and relatively easy to create realistic face swops that leave few traces of manipulation. Since a version of the app appeared on Reddit in January, it has been downloaded more than 120,000 times, according to its creator.
Deepfakes are one of the newest forms of digital-media manipulation and one of the most obviously mischief-prone. It is not hard to imagine this technology being used to smear politicians, create counterfeit revenge porn or frame people for crimes. Lawmakers have already begun to worry about how deepfakes could be used for political sabotage and propaganda.
Even on morally lax sites such as Reddit, deepfakes have raised eyebrows. Recently, FakeApp set off a panic after technology site Motherboard reported that people were using it to create pornographic deepfakes of celebrities.
Pornhub, Twitter and other sites quickly banned the videos and Reddit closed a handful of deepfake groups, including one with nearly 100,000 members.