Contribution Efficiency System (CES) Review Report

:speech_balloon: Hello Safe community!
Through over 4 months of operating the Contribution Efficiency System (CES), we have had the privilege of interacting with almost everyone who makes OBRA a reality for Safe DAO, including ample time with the initiatives pushing the DAO forward. We’re grateful to have played a role in progressing OBRA thus far and hope to share many insights from our experiences as the CES program managers in this review.

Table of contents

  • I. Abstract
  • II. Background and Scope
  • III. Executive Summary
  • IV. Operations Review
  • V. Stakeholder Feedback & Program Outcomes
  • VI. Key Learnings
  • VII. Way Forward
  • VIII. Helpful Links
  • Acknowledgements

I. Abstract

At the end of 2023, SEP#8 ratified OBRA, which set out to provide “a structured use of resources to ensure the most efficient and useful utilization of SafeDAO’s assets.” The OBRA framework emerging from this proposal provided the necessary structure to ensure the correct allocation of Safe DAO’s resources, and the CES built atop OBRA helps ensure transparency and alignment in how those allocated resources are used.

Since April 2024, the CES team from Areta has operated the Contribution Efficiency System to ensure initiatives and key DAO stakeholders have the requisite support and resources to fulfill their objectives across onboarding, milestone-setting, tracking, reporting, and community interactions.

This CES Review Report dives into the key facets of the CES, highlighting the operational elements, the feedback received, and the impact the program has had in helping OBRA realize its overarching objectives.

II. Background and Scope

The CES was proposed during the first OBRA season in January 2024, aiming to improve governance efficiency and decision-making for Safe token holders. The CES, initially an enabling tool for OBRA, focused on transparency and accountability of initiatives by streamlining OBRA onboarding, milestone setting, progress tracking, and outcome reporting to SafeDAO.

Following the passing of SEP17 (CES) in January, the team from Areta researched, analyzed, and designed a facilitatory program, which we have now operated for four months. Throughout this period, we have refined and automated many of the OBRA processes that initiatives touch as part of their participation in the program.

This report aims to provide a comprehensive overview of the CES’s operational processes, achievements, learnings, and outcomes during its initial four-month operating period. Accordingly, the scope of this report encompasses:

→ A detailed breakdown of program operational elements
→ CES/OBRA feedback across various key stakeholders
→ Key program learnings
→ Tangible OBRA improvement elements

III. Executive Summary

Since its inception in April 2024, the Contribution Efficiency System (CES) has facilitated the onboarding, milestone-alignment, tracking, and reporting of 12 OBRA initiatives. Beyond these core elements, the CES team has led community (token-holder) interaction points, gathered feedback, and aided in the implementation of tangible OBRA improvement amendments.

We’re optimistic the past four months of CES have helped create clearer and more robust structures within which OBRA initiatives feel supported and key stakeholders within the DAO feel informed. This Review Report aims to reflect on our experience managing the CES over the past few months, including insights, operational processes, feedback, learnings, and what may lie ahead.

IV. Operations Review

From the outset of the CES, the key functions of our operations fell under a handful of core categories. Beyond the four pillars of grant programs that emerged from our efficiency mapping exercise (onboarding, milestone-setting, tracking, and reporting), the role of the CES touched many of the main facets of OBRA, including leading token-holder touch-points and collecting stakeholder feedback. In guiding our program strategy and implementation, we stress-tested the logic of each design element in turn, asking outcome-focused questions that informed our different design choices.

Throughout this section, we will share the recipe book of the CES, breaking the program down into these component parts and analyzing each in turn.

a. Stakeholder (Token-Holder) Interactions


A large part of the CES’s impact results from coordinating and facilitating communication between different OBRA stakeholders, and even beyond to the entire Safe DAO. The CES maintains Telegram chats with each initiative, including participation from the Safe Governance team, consistently monitoring and attending to queries or concerns from OBRA projects. Additionally, the CES hosts a channel for all active initiatives to share OBRA-wide updates, answer questions, and facilitate engagement between projects and within the Safe ecosystem. In practice, we have found these CES channels to be overwhelmingly positive, interactive, and productive.

Further, the CES has played a lead role in organizing and facilitating OBRA updates to the broader Safe DAO. Most recently, we hosted a Community Call which, beyond allowing the CES to provide an OBRA status report to the DAO, gave several OBRA initiatives the opportunity to share their progress and achievements.

In reality, the CES’s impact on the OBRA process began before the ratification of SEP17, as we fulfilled a sparring-partner role to Safe governance during the OBRA design process. Specifically, the CES team from Areta helped create, workshop, and analyze the different OBRA KPIs and core strategy funding buckets.

b. Onboarding & Milestone-Setting


A critical juncture in an initiative’s life cycle occurs after it has been approved and before it has begun working. In this phase, it is crucial to ensure that the initiative is well supported in its transition to operations and, more importantly, that its goals and the minutiae of its KPIs/deliverables are aligned with the broader OBRA strategic objectives.

Once an initiative reaches the CES (after completing various Safe-side onboarding requirements), we share a refined version of their proposal which requires them to ground their individual stated objectives within OBRA’s goals, as well as clarify more details on success metrics and future plans. In this phase, the CES plays an active role as a sparring partner to the initiative, guiding them towards the most beneficial versions of each of their milestones.

This onboarding and milestone-setting session has also proven a valuable opportunity for initiatives to pose questions, concerns, or thoughts that may impact their output. You can view all the CES milestone documents here.

c. Tracking & Reporting


One of the foremost roles of the CES has been to collect tracking updates from each initiative and report them back to the DAO monthly in an easily digestible format. Roughly one week prior to reporting, we distribute an individualized Notion tracking template that captures the following details from each initiative:

→ Summary of achievements from the month
→ Milestones reached during the period
→ Challenges or blockers that arose and delayed/impacted progress
→ Milestone progress snapshot communicating how far along the initiative is on each of its outlined milestones

We then carefully review each initiative’s reporting sheet, clarify any questions, note any incongruities, and leave our feedback on the progress before sharing it with the Safe Governance team ahead of invoicing and payments. Our support as an initial filter and reporter has increasingly smoothed initiatives’ tracking, reporting, and funding experiences. Once we have collated all of the responses for a period, we synthesize the tracking updates into an easily digestible and succinct format to share with the DAO. While these reporting updates are shared once a month, we also maintain active reporting sheets that reflect the individual status of each OBRA initiative at any given point in their operation, including their funding overview, milestone progress, and historical reporting sheets. You can find the registry of OBRA reporting sheets here, and a collection of our reporting on the Safe forums here.

d. Feedback & Review Process


Our unique position, holding direct and frequent interactions with nearly every OBRA stakeholder, afforded us the opportunity to consistently and qualitatively iterate on CES program design elements based on first-hand feedback. One example of this was the implementation of Notion template tracking sheets for each reporting period, which was suggested by an initiative that struggled with the experience of Telegram forms. In practice, this implementation allowed us to host all tracking and reporting elements on one platform for ease of creation and consumption.

Beyond gathering feedback on and iterating on our own program, the CES also played a role in collecting feedback and strategizing on broader OBRA topics and design. One instance of our support was helping co-lead the OBRA Retro round, held in July, which collected mid-way feedback from the first wave of OBRA initiatives.

e. OBRA Hub


One of the core elements of the CES is our Contribution Efficiency System Notion Hub, which hosts all of the most salient information regarding our program. Alongside all milestone documentation and tracking and reporting sheets, the CES Notion Hub holds resources for those looking to learn more about the program, and about OBRA itself.

V. Stakeholder Feedback & Program Outcomes

As part of ‘Phase 1: Validation’ outlined in our proposal, we conducted an efficiency mapping exercise: an initial scoping of grant programs across leading DAOs, seeking to gather insights into the industry’s best practices, key design elements, and opportunities for improvement. Using insights gathered from conversations with grant managers and recipients, as well as from canvassing public materials on grant programs, we developed hypotheses on each of the four core elements of the CES: onboarding, milestone-setting, tracking, and reporting. While these hypotheses were invaluable in the efficiency mapping exercise, as well as in designing and efficiently operating the CES for four months, we wanted to ensure that the outcomes they led to, and thereby the hypotheses themselves, were appropriately considered by the relevant stakeholders of the DAO. Consequently, our hypothesis-testing exercise, along with more general CES- and OBRA-related feedback, helped guide the recent touch points (i.e. surveys, questionnaires, and conversations) we had with both active and prospective OBRA initiatives. Please find the takeaways from these interactions below:

Initiatives


As the participants who have experienced each phase of the external-facing CES processes, the OBRA initiatives are some of our highest-context sources of feedback. As we approached the conclusion of the first iteration of the CES, we distributed a survey to initiatives soliciting feedback on their experience with our program. While the graphic below highlights the quantifiable outcomes of the initiative survey, there were also many valuable qualitative inputs, of which we share some select highlights:

Onboarding & Milestone-Setting

“We preferred the async format, which makes it easier to balance with other client work.”

CES Communication & Responsiveness

“CES communication is great, and they don’t take too long to get back to questions.”

General Feedback

“Overall the experience with CES was great with clear communication, responsiveness, and good reporting structure with proper tracking and info pages on Notion.”

“Would be great to have to help following the payment schedule after we submitted the invoice.”

OBRA Applicants


Beyond those who were accepted to OBRA, prospective applicants are also a very relevant group of initiatives with deep insights into a key part of the OBRA process. Recognizing this, throughout our operations the CES team actively monitored and distributed feedback questionnaires to authors of proposals that fell short of ratifying on Snapshot for various reasons. The questionnaire centered on the OBRA application experience, and included the following questions:

→ Tell me about your experience submitting the proposal. What went well? What challenges did you face?
→ Can you share your experience submitting a proposal at a different organization or DAO? How did it compare?
→ Do you feel any aspects of the OBRA proposal process could be improved?

The responses have proven an extremely valuable source of insights and tangible improvement suggestions for the OBRA proposal and application process. Of the feedback, some key themes emerged, which we have summarized succinctly below:

Lack of Feedback:

Several respondents noted the difficulty in obtaining feedback from delegates or the community. The absence of a public channel for communication and the lack of constructive feedback made it challenging to gauge the alignment of proposals with community needs.

  • Example: “There needs to be a way to receive formal constructive feedback from delegates in time…”

DAO Application Process Benchmarking:

Respondents frequently compared OBRA’s proposal process to that of other DAOs, noting that other organizations often had quicker turnaround times, simpler processes, and more interactive feedback mechanisms.

  • Example: "[In other DAOs] it is possible to receive relevant feedback on time and customize the proposal so that it has sufficient consensus to pass formal voting.”

Formalized RFP and Better Community Engagement:

There was a consensus on the need for a more formalized RFP process and better community engagement throughout the proposal development and submission phases.

  • Example: “Implementing a more formalized Request for Proposals (RFP) process sponsored by the community and delegates…”

VI. Key Learnings

As mentioned above, given the nature of our role as intermediaries of key information and resources regarding OBRA processes and the initiatives/stakeholders the program comprises, we have deep insights into the inner workings of OBRA. Through both the collection of feedback and our own insights as operators, we have been able to maintain and grow a repository of tangible OBRA improvement elements. This OBRA Amendments repository is hosted on our Notion Hub to provide transparency into the nuanced changes being made to these key DAO processes. Furthermore, we shared key OBRA process learnings not only through the Notion Hub repository but also in frequent conversations and qualitative iterations with Safe Governance throughout our operations.

Notably, some of these tangible improvement elements have already been covered and implemented by the Safe team, including pro-rata funding amounts and procedurally controlling for preemptive start times of initiatives. You can find the link to the forum post on these OBRA amendment updates here. Other suggested improvement elements from the CES remain open for inclusion in future versions, as OBRA continues to iterate and grow.

VII. Way Forward

Reflecting on the past four months of CES operations and the feedback received from key stakeholders, we are encouraged by the role we played in helping OBRA achieve its ultimate goal of optimized and efficient resource allocation.

Beyond researching and setting up the initial program design, we always maintained a keen focus on optimizing and automating as many CES processes as possible, both to reduce undue load on initiatives and on our program itself. As a result, the CES is in a position to continue operations in a leaner capacity than before. Given the transitional period for OBRA, and the need for continuity for initiatives both within and now entering OBRA, we are looking to extend operations for an additional three months. Keep your eyes out for a follow-up proposal later this week.

VIII. Helpful Links

  • Initial Proposal. See here.
  • Snapshot Vote. See here.
  • CES Notion Hub. See here.
  • Efficiency Mapping. See here.