AGP-104 approved development and execution of an App Mining program intended to reward Aragon App developers for publishing and maintaining applications based on usage KPIs.
The proposal offered an example set of KPIs to illustrate how we could define an Aragon App score and distribute a fixed budget amount between application maintainers. However, as indicated in the proposal, we want to have an extended discussion to make sure the metrics we choose are not easy to game and serve as a good proxy for the value the application is bringing to the Aragon community.
While the current budget for App Mining comes from Aragon’s treasury, it is important for the program to create more value for the network and ANT holders than it costs. To ensure this is the case we want to make sure the KPIs we pick either directly relate to usage fees that are collected and allocated to the program, or are strongly correlated with increasing price of ANT. From that perspective the following might be ideal:
- Apps used in organizations with installed fundraising weighted by ANT locked as collateral
- Apps used in organizations paying subscription fees to the Aragon Court
- Activity in organizations operating on Aragon Chain or Flora, where transaction fees accrue value to ANT
However, these services have not launched so the initial version of App Mining will need to rely on KPIs which are more indirectly related to value accrual and adoption of the Aragon Platform:
- Apps used in organizations which hold ANT
- Apps used in organizations weighted by Assets Under Management (AUM)
- Apps used in organizations weighted by Activity Volume
Organization KPIs -> App Scoring
The KPIs above approximately relate to how valuable an organization is to the Aragon Network, but generally do not provide any insight into how valuable the applications installed in an organization are to that particular organization.
In order to generate an application score, we can first create an organization score using some combination of organization KPIs, then split and credit that score to each installed application to determine an Application Score across all organizations.
This approach seems the most straightforward and practical, but I am interested in suggestions which might provide a more direct way to measure the individual utility value of an Application.
Publisher Eligibility
In order to participate in the App Mining program, we need to be able to associate a published APM package with a recipient address to which the App Mining payout will be sent.
- Should this address be related to the actual publishing address used for APM?
- Should we provide a way for publishers to opt-out of App Mining?
- Can the Agent be used to publish to APM?
Blacklisting Organizations and Publishers
If we only use KPIs that are directly linked to quantifiable value accrual to ANT it may not be possible to game the App Mining program by skewing the organization KPIs.
However, if we use a metric like AUM or Activity Volume on Ethereum, it’s possible that we may see people creating fake applications and/or fake activity in order to secure an app mining reward. Initially we will manually review payouts and flag anything suspicious, ask for community input, and take action if appropriate. In the future, this process could be handled by the Aragon Court.
Proposed KPIs and implementation details
Initially I propose creating an organization score weighted based on the following KPIs:
| KPI | Definition | Proposed Weight |
|---|---|---|
| ANT | Sum of ANT held in each app in an organization. | 25% |
| AUM (Assets Under Management) | Sum of the value in DAI of the ETH, ANT, DAI, and USDC held in each app in an organization. | 25% |
| Activity | Sum of transactions involving any of the organization's apps over the last 90-day period. | 50% |
To compute the organization score, each KPI would first be normalized as a ratio of the total across all organizations, then the KPIs would be combined into a single score using a weighted average.
For example, if an org holds 100 ANT and the total amount of ANT held across all organizations is 1,000 ANT, then the normalized ANT KPI would be 10%. The same process would be used to normalize AUM and Activity. To compute the organization score we would take ANT * .25 + AUM * .25 + Activity * .50. Let's say the values for our example org are 10%, 20%, and 30% respectively; the organization score would then be 22.5%.
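The worked example above can be sketched in a few lines of Python. The KPI names and weights follow the proposal; the raw values are made up to reproduce the 10% / 20% / 30% example:

```python
# Illustrative sketch of the organization score calculation.
# Raw KPI totals for one example organization and across all organizations
# (the AUM and Activity totals are invented to match the worked example).
org_kpis   = {"ANT": 100,  "AUM": 2000,  "Activity": 30}
total_kpis = {"ANT": 1000, "AUM": 10000, "Activity": 100}
weights    = {"ANT": 0.25, "AUM": 0.25,  "Activity": 0.50}

# Normalize each KPI as the organization's share of the network-wide total,
# then combine the normalized KPIs with a weighted average.
normalized = {k: org_kpis[k] / total_kpis[k] for k in org_kpis}
org_score = sum(weights[k] * normalized[k] for k in weights)

print(normalized)            # {'ANT': 0.1, 'AUM': 0.2, 'Activity': 0.3}
print(round(org_score, 4))   # 0.225, i.e. 22.5%
```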
Then to compute the application score, we would take the organization score and divide it by the number of applications installed in the organization, allocating the fractional amount to each installed application. We would compute a score for all applications regardless of whether the publisher has opted in to the App Mining program, but would exclude publishers who have not opted in from the payout calculation. If the example organization's score is 22.5% and it has 8 apps installed, the organization would contribute 2.81% to each app's score.
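Continuing the worked example, the even split across installed apps might look like this (the app names are purely illustrative):

```python
# Splitting an example organization score evenly across its installed apps.
org_score = 0.225   # 22.5%, from the example organization above
installed_apps = ["voting", "finance", "tokens", "agent",
                  "vault", "survey", "payroll", "fundraising"]  # 8 example apps

per_app_credit = org_score / len(installed_apps)

# Every installed app is credited, whether or not its publisher opted in;
# opted-out publishers are only excluded later, at payout time.
app_scores = {app: per_app_credit for app in installed_apps}

print(round(per_app_credit * 100, 2))  # 2.81 (% contributed to each app's score)
```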
Both Organization Scores and Application Scores will be re-calculated periodically and displayed on apiary.1hive.org.
App Mining payouts will be calculated after each ANV. We will try to get App Mining ready as quickly as possible, but due to the change in the ANV schedule the first payout will likely need to be deferred until ANV-6.
To compute payout amounts we will use the application score to sort applications into a ranked list. For each payout cycle, there is a determined “pot” of total earnings that will get paid to apps, set by AGP-104 at 100K ANT. The top app gets paid 20% of the total pot. So, for a pot of 100K ANT, the top app receives 20K ANT. The next app gets paid 20% of the remaining pot. The remaining pot is 80K, and 20% of that is 16K ANT. This process continues until either every app has been paid or the payout amount is below 200 ANT, whichever comes first.
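The payout rule described above can be sketched as follows (the function name and the assumption of 50 ranked apps are mine; the pot, share, and minimum come from the text):

```python
# Sketch of the proposed payout rule: each app in ranked order receives 20%
# of the pot remaining after the apps ranked above it, until either every
# app has been paid or a payout would fall below the 200 ANT minimum.
POT = 100_000       # total App Mining budget in ANT, per AGP-104
SHARE = 0.20        # fraction of the remaining pot paid at each rank
MIN_PAYOUT = 200    # ANT; smaller payouts are not made

def payout_schedule(num_apps, pot=POT, share=SHARE, minimum=MIN_PAYOUT):
    payouts = []
    remaining = pot
    for _ in range(num_apps):
        amount = remaining * share
        if amount < minimum:
            break
        payouts.append(amount)
        remaining -= amount
    return payouts

schedule = payout_schedule(num_apps=50)
print(len(schedule))                       # 21 apps receive a payout
print(round(schedule[0]))                  # 20000 ANT to the top-ranked app
print(round(schedule[1]))                  # 16000 ANT to the second-ranked app
print(round(sum(schedule[:5]) / POT, 3))   # ~0.672 of the budget to the top 5
```

Because each payout is a fixed fraction of what remains, the schedule decays geometrically, which is what concentrates most of the budget in the top-ranked apps.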
This payout policy can be visualized as follows:
With this rule ~70% of the budget is allocated to the 5 highest-ranked apps and ~90% to the 10 highest-ranked apps; with a budget of 100K ANT and a minimum payout of 200 ANT, payouts will be made to the top 21 apps.
An alternative approach, which may result in a broader distribution of App Mining payouts, would be to simply pay out the App Mining budget in proportion to each App Score, keeping the same minimum payout amount.
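A minimal sketch of this alternative, with invented app names and scores (how sub-minimum remainders are handled is my assumption, not part of the proposal):

```python
# Sketch of the proportional alternative: distribute the budget in
# proportion to each app's score, still skipping payouts below the minimum.
POT = 100_000
MIN_PAYOUT = 200

# Illustrative scores; in practice these would come from the App Scores above.
app_scores = {"app_a": 0.40, "app_b": 0.30, "app_c": 0.28, "app_d": 0.001}

def proportional_payouts(scores, pot=POT, minimum=MIN_PAYOUT):
    # Re-normalize in case opted-out publishers were removed from the table.
    total = sum(scores.values())
    payouts = {app: pot * s / total for app, s in scores.items()}
    # Drop payouts under the minimum; here the remainder simply stays in the
    # treasury (one possible choice; it could also be redistributed).
    return {app: amt for app, amt in payouts.items() if amt >= minimum}

result = proportional_payouts(app_scores)
# app_d's ~0.1% share (~102 ANT) falls below the 200 ANT minimum, so only
# app_a, app_b, and app_c receive payouts.
print(sorted(result))  # ['app_a', 'app_b', 'app_c']
```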