This article was first published on iExec - Medium
DApp of the Week #9 — NSFW (AI Trained Model Monetization)
NSFW (Not Safe For Work) is a common type of application that filters out pictures that are inappropriate for a work environment. It is useful, for example, when a user uploads a new picture to a web service: the service can automatically check whether the image is appropriate to show to other users.
The application can use several machine learning models. For example, trained models such as “Not safe for school” or even “Not safe for anyone” enable the app to identify these kinds of pictures.
In the iExec NSFW dapp, the requester (end-user) can choose which model they want to use to filter their image. The default is the standard model historically maintained by Yahoo. The end-user workflow is straightforward: you choose the dataset you’re going to use, select a picture to be analyzed, and run the app.
When you click on ‘Buy analysis’, you can see that, as an end-user, you pay 1 RLC directly to the dataset owner, and 5 RLC to the worker executing the task.
As an end-user, you never access the whole dataset: you simply get the piece of data you need. This is an example, in its simplest form, of how secure ‘data renting’ works. Requesters can use valuable datasets without ever being able to inspect them.
In the end, users get their analysis result, while dataset owners, devs and workers are all rewarded in RLC. My cat gets a rating of 0.019177, which means the picture can be used safely, and you can see all the details of this execution on the iExec Explorer:
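To give an idea of how a service might act on such a score, here is a minimal sketch in Python. The threshold value and function name are illustrative assumptions, not part of the dapp; the model simply returns a probability between 0.0 (safe) and 1.0 (not safe), and the consuming application decides where to draw the line.

```python
# Hypothetical helper: decide whether an image is safe to display,
# given an NSFW score in [0.0, 1.0] such as the 0.019177 above.
# The 0.2 cutoff is an illustrative assumption, not the dapp's value.
NSFW_THRESHOLD = 0.2

def is_safe_to_display(nsfw_score: float) -> bool:
    """Return True when the model considers the image work-safe."""
    if not 0.0 <= nsfw_score <= 1.0:
        raise ValueError(f"score out of range: {nsfw_score}")
    return nsfw_score < NSFW_THRESHOLD

print(is_safe_to_display(0.019177))  # the cat picture above: True
print(is_safe_to_display(0.87))      # a flagged picture: False
```

A real service would typically tune this threshold to its audience, e.g. stricter for a “Not safe for school” model than for a general-purpose one.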
Training a machine learning model is not ...