How we developed the ratings
We invited a diverse and independent group of digital participation experts to serve on the technology review committee that evaluated the leading digital participation platforms. Committee members did not interact directly with the platforms' developers during their evaluations. In addition to expertise, we prioritized diversity of gender and geography so that committee members would bring different perspectives on what is most important and helpful for people in various contexts. The six committee members are from Africa, Asia, Eastern Europe, Latin America, North America, and Western Europe.
This year, our ratings include 30 platforms, all of which also appeared in the 2022 edition; some platforms from that edition were removed due to inactivity. The committee agreed on the list of platforms. For the next edition, the committee is exploring new ways to include more platforms from underrepresented regions in the Global South.
Two committee members applied the evaluation criteria for each category to the platforms. When their results disagreed, the two members discussed the platform and reached consensus on the final rating. Before the ratings were finalized, developers received a draft version of their platform's ratings and were given the opportunity to supply missing information or provide evidence on disputed ratings. No committee member knew a platform's final overall rating until the total score was calculated.
User and developer survey
In addition to the expert review, the ratings were informed by surveys of both users and developers. The user survey was conducted online from November 9 to 23, 2023, and asked users of digital participation platforms to evaluate their experience against defined criteria. The survey instrument was shared through our newsletter, social media, and our members and networks. We received 42 responses from 14 countries across South America, Africa, and Eastern and Western Europe. Respondents included staff members from local and federal governments, civil society organizations, universities, and businesses. Because of the limited response this year, we also used survey responses from 2022 to inform the ratings.
In addition, we surveyed platform developers to collect useful missing information; this survey was also open from November 9 to 23, 2023. We further extended our due diligence process so that platform developers could view the draft ratings and add missing information. More than 60% of the platform developers engaged in the process.
It is important to note:

- The ratings compare digital participation platforms across a range of criteria to help governments and organizations decide on a suitable platform for their needs.
- Ratings are not based on hands-on testing of the platforms, but on an assessment of their features, pricing, track record and reliability, and similar factors.
- Committee members based their evaluations mainly on publicly available information on the platforms’ websites and elsewhere, in addition to their own expertise and responses to the developer and user surveys.
What changed from last year’s ratings
| Changes | 2023 Ratings | 2024 Ratings |
| --- | --- | --- |
| Average score | In 2022 and 2023, platforms received a standardized score across all criteria. | The overall score has been removed this year; only the standardized score for each criterion is presented. |
| Categories | Platforms were divided into three categories. | To address a mismatch between categories and rating criteria, the committee reduced the number of categories to two and reduced their significance in the presentation of the ratings, which had caused confusion. The two categories are now used for descriptive purposes only. |
| Airtable presentation | The presentation led to many issues, including some developers treating the ratings as “rankings,” i.e., marketing their platform as the number 1/best platform, although our tech expert committee recommends using them as noncomparative ratings, since a platform’s usefulness depends on a number of factors. | The ratings presentation has been changed. |
| Changes to the criteria | | |
| Due diligence process | Platform developers had one opportunity to share information with the committee. Information shared via email and the developer survey was used to improve the ratings, especially for variables where the committee could not find information online. | Platform developers had two opportunities to share information in order to improve their scores. |
Next: The Committee