Understanding the Ratings
What do the scores mean?
A successful digital participation program depends on selecting the appropriate platform and employing the best strategies. Our ratings, along with our guide and online training, can help you select a suitable tool for your context and needs.
This page helps you explore the ratings and find the tool that meets your needs. While you may already have a tool in mind, we encourage you to explore all available options, keeping in mind the criteria that matter most to you. For example, if you are looking for a platform that doesn't require much technical support, start by checking the scores in the "capacity requirements" criterion to compare the different platforms.
Remember, a platform may be very strong in one criterion, such as cost, despite having a lower score in another, such as accessibility, while another platform with the same overall score might perform well on completely different variables. Use our ratings as a starting point to explore a variety of options. Many platforms offer free trials or demos, allowing you to experience the tool before making a selection.
So, what do the scores mean? Our methodology evaluates tools against six criteria, each composed of different variables, and each criterion is scored out of 100 points.
Our cost criterion has one variable: the annual cost of the software, i.e., how much the software itself costs before any additional expenses.
100 points = Free software, most likely an open source tool
80 points = $1 to $1,000/year
60 points = $1,001 to $5,000/year
40 points = $5,001 to $20,000/year
20 points = Over $20,000/year
"Information unavailable" means we could not ascertain the cost of the software for that platform.
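If it helps to see the tiers as a simple rule, here is a minimal sketch in Python that maps an annual cost to the points above; the function name and structure are illustrative only and are not part of our rating methodology.

```python
def cost_score(annual_cost_usd):
    """Illustrative mapping from annual software cost (USD) to the cost-criterion score."""
    if annual_cost_usd is None:
        return None        # Information unavailable: cost could not be ascertained
    if annual_cost_usd == 0:
        return 100         # Free software, most likely an open source tool
    if annual_cost_usd <= 1_000:
        return 80
    if annual_cost_usd <= 5_000:
        return 60
    if annual_cost_usd <= 20_000:
        return 40
    return 20              # Over $20,000/year
```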
Our capacity requirements criterion focuses on five variables related to the internal capacity needed to configure and use a platform for participatory processes. The variables are scored in three ways: on a scale of 1 to 5 (tech expertise required for configuration and tech expertise required for maintenance), on a scale of 1 to 3 (tech support available), and as yes/no (process planning guidance and hosting capacity). Platforms that scored between 80 and 100 points (such as Adhocracy+, Cap Collectif, Civocracy, ConsultVox, CoUrbanize, Discuto, Your Priorities, Participation / Decision21, and Social Pinpoint) are easy to configure; you won't need to pay an expert to set them up and maintain them; and they provide flexible hosting options. They also offer high-quality tech support for organizations using their tool, including full-service support for both tech and process.
This criterion also helps you consider the additional costs your program may incur. For example, will you have to pay an extra fee for tech support, hosting, or tech expertise to help set up and maintain the tool?
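Purely as an illustration, the sketch below shows one way the five capacity variables could be combined into a 0–100 score. It assumes equal weights and that higher values mean less expertise is required; the actual weighting and scale direction used in our methodology may differ.

```python
from dataclasses import dataclass

@dataclass
class CapacityInputs:
    config_expertise: int            # 1-5; assumed: higher = less expertise needed to configure
    maintenance_expertise: int       # 1-5; assumed: higher = less expertise needed to maintain
    tech_support: int                # 1-3; assumed: higher = better tech support available
    process_planning_guidance: bool  # yes/no
    hosting_capacity: bool           # yes/no (flexible hosting options)

def capacity_score(c: CapacityInputs) -> int:
    """Hypothetical aggregation: normalize each variable to 0-1, average with
    equal weights, and rescale to 100 points."""
    normalized = [
        (c.config_expertise - 1) / 4,
        (c.maintenance_expertise - 1) / 4,
        (c.tech_support - 1) / 2,
        1.0 if c.process_planning_guidance else 0.0,
        1.0 if c.hosting_capacity else 0.0,
    ]
    return round(100 * sum(normalized) / len(normalized))
```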
Our features criterion focuses on 13 features that are essential for running a participatory program with a digital tool. However, you might not require all the features listed; the features you need depend on your context and program. This score is not an indication of the quality of the features a platform offers. If you need a platform that meets a diverse set of needs and offers multiple features in one place, look for high-scoring platforms with more features. If you are looking for a platform that serves a more specific need or situation, look at lower-scoring platforms that focus on a particular feature.
In our digital participation guide, we note that the digital divide is a significant challenge for digital participation. Therefore, we urge people to consider disparities in internet access, the cost of data plans, access to up-to-date personal devices, and degree of familiarity with the necessary technology. When selecting a tool, check whether it provides accessibility features that enable you to create a meaningful participatory process for all. In this criterion, platforms such as Your Priorities, Make.org, Decidim, Citizen Lab, CONSUL, Efalia Engage (Fluicity), and others are highly rated because they performed well across the seven outlined variables. They:
Have been used in more than 10 countries in various regions,
Offer functionality in at least six languages,
Provide accessibility features for people with disabilities, such as allowing clients to customize platforms for people with mobility, visual, and hearing disabilities,
Enable hybrid integration with in-person activities to ensure inclusivity for people who are offline or people who do not know how to use digital tools,
Work with the most commonly used browsers and modest technology requirements, meaning they can be used on devices ranging from low-tech to high-tech and on both old and new browsers,
Are suitable for communities with connectivity challenges, e.g., slower connection speeds,
Are fully mobile-responsive, meaning all features are available and usable on mobile devices.
To ensure that our audience considers the ethics of using digital tools for participation, our ratings use five indicators covering digital ethics and transparency issues. In this criterion, the highly rated tools (such as Adhocracy+, Consider.it, CONSUL, Decidim, Loomio, Make.org, Your Priorities, and others) received high scores for various reasons, including:
Being open source, meaning the platform's code is published under an open-source license rather than being proprietary software,
Having extensive and high-quality data policies clearly outlining data policy transparency (i.e., do platforms inform users how they use the data?) and ethical use (i.e., do platforms sell user information to third parties?),
Offering comprehensive data protection mechanisms, such as authentication and encryption, meaning they provide outstanding protection of user data, especially against data leaks,
Having clear guidelines and dedicated features for transparency of moderation, ensuring that organizations and users are protected against hate speech,
Providing raw data export for their users, allowing them to download the data (keeping in mind the data policy and protection rules, e.g., the exclusion of users' personal data).
Our sustainability criterion evaluates the reliability, track record, and long-term viability of the tools. Your Priorities, Loomio, Make.org, Munipolis, Ethelo, Decidim, Delib, CONSUL, and Bang the Table / EngagementHQ all scored over 80 points on this criterion. We assessed all tools on three indicators:
Time available on the market: Tools that have been available for more than five years have a strong track record and are more likely to be sustainable; their developers have maintained and updated them, upgrading features and support since launch.
Diversity of contributors: For this variable, we considered whether the tool is reliable by evaluating the extent to which multiple actors (e.g., governments, civil society and nongovernmental organizations (CSOs/NGOs), and investors or donors) contribute to sustaining the platform, rather than a single company. Open source tools scored highly, as they have many contributors outside of the company or organization that runs them.
Profile and breakdown of institutional users: Platforms scored highly if they have at least three different clients (e.g., governments, CSOs/NGOs, universities and research organizations, businesses). This indicates adaptability and a history of successful use in different contexts by different users.