We've worked with a lot of bots in the last ten years, and the latest one is maybe the most popular. It's called the Random Darknet Shopper. People also call him Randy. I don't know, normally we try to give them female names, but this is what came out.

It's basically a bot which goes into the deep web, the so-called darknet, with a small budget of 100 US dollars converted into bitcoin. It randomly chooses a product which is being sold there, orders it, buys it, and has it shipped directly to the address of the gallery or exhibition space.

So we tried to produce a piece which is out of our control, something which would live on its own, something which possesses its own money. The Random Darknet Shopper has, in a way, his own money, and the idea was to connect the art space with the deep web, and to speak first of all about trust: how does trust work in a network where you don't know who you are speaking to? Because there is this encryption layer between me as a buyer and you as a seller. And we really wanted to see how those things work within a network which gives you anonymity.

We also like to introduce an element of losing control. These bots we create, like the Random Darknet Shopper, have a certain randomness, or autonomy, to them, so they can be separate from us. They can take action separately from us.
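The purchasing loop described here (a fixed budget converted to bitcoin, a random pick from darknet listings, shipment to the gallery) could be sketched roughly as follows. This is a minimal illustration, not the artists' actual code: the listing structure, function name, and shipping address are all hypothetical stand-ins.

```python
import random

BUDGET_USD = 100  # the bot's fixed budget, converted to bitcoin at order time


def pick_and_order(listings, ship_to):
    """Randomly choose one affordable listing and place an order.

    `listings` is a hypothetical list of dicts like
    {"name": ..., "price_usd": ...}; a real darknet-market
    interface is deliberately not modelled here.
    """
    affordable = [item for item in listings if item["price_usd"] <= BUDGET_USD]
    if not affordable:
        return None  # nothing within budget this round
    choice = random.choice(affordable)
    # In the actual piece the order would be paid in bitcoin and
    # shipped to the exhibition space; here we just return the details.
    return {"item": choice["name"], "ship_to": ship_to}


order = pick_and_order(
    [{"name": "a pair of sneakers", "price_usd": 75},
     {"name": "a scanned passport", "price_usd": 150}],
    ship_to="the gallery",
)
```

With this toy data only one listing fits the budget, so the random choice is forced; with many affordable listings, each run may order something different, which is the "losing control" the artists describe.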
I think our main question there is always: how do you talk about the responsibility they also have in a society, for example? The bots we program are very low-tech. They don't use a lot of resources. But we're actually living in a society where we're getting used to bots that do have some resources and an impact on the systems they operate in, like trading bots or, I don't know, self-driving cars. And, in a playful way, I think our bots raise questions that are then raised on a greater level by their cousins, which are not programmed by us. We like to play with those trust systems.

We also heard that it might be possible that we had called the police ourselves so they would show up, because it's good for the story.

You mean with the Random Darknet Shopper?

I like that, because, yes, it would have been a fucking intelligent thing to do. But it wasn't our idea; it didn't occur to us. But this is a sense of humour I really like. I think we learn a lot from online communities about that kind of humour. We like troll culture; we're part of it somehow. But we never want to expose people. We want to expose systems.