EIG 3 promotion: conclusions drawn

17 January 2020 / by Soizic Pénicaud, EIG team

On 15 November 2019, after 10 months of intense work and almost 18 months after the launch of the call for projects that established the third promotion of public interest entrepreneurs, the promotion officially came to an end. We were proud to close it with a public debriefing event on 12 November 2019, where the results of each challenge were presented.

We also wanted to assess the promotion’s collective results against the programme’s two ambitious objectives: producing useful, open source and sustainable tools within administrations, and contributing to the State’s digital transformation and the insourcing of digital skills.

What worked well? What would benefit from being improved or discarded?

The findings of this article are drawn from a variety of sources: anonymous questionnaires, workshops and data collected on the challenges. These methods (and their limitations) are described in detail at the end of the article.

[Photo: people chatting in front of demonstration stands in a large room] Challenges presented their achievements to visitors on 12 November 2019. For a summary of the event, see the blog post on Etalab.

10 months later, what are the main findings in terms of digital transformation and tool production?

Insourcing of skills & digital acculturation

📍 What matters most in the programme is the people. Almost all mentors indicated that recruiting rare profiles was a key factor in their decision to apply for the programme (co-funding was the second factor, cited by half the mentors).

This year, the promotion was successful in insourcing skills: 5 EIG teams were fully or partially hired by the administrations backing their challenges. One third of EIGs are continuing to work for the administration after the programme, some of them in administrations other than their host.

The sustainable recruitment of innovative profiles in administrations is also reinforced by the funding the EIG programme receives under the Finance Law, which will be used to establish a fourth EIG promotion in 2020.

One mentor raised an unexpected effect of the programme for the State’s attractiveness as an employer: “EIGs inspire [other] applicants to join the team”.

📍 Acculturation of host administrations: more than half the challenges organised training sessions for business teams on various topics - data science, geomatics, open data, semantic analysis, accessibility. EIGs also contributed to organising meetups and facilitating workshops for government services.

These training sessions helped business teams become more familiar with digital roles and skills. A mentor explained this as follows: “the role of UX designers is better understood in administrations and is more sought after by teams”. Another noted the “very important demystification of artificial intelligence and data science”.

📍 Mentors’ skills development: public servants who participated in the programme reported an increase in skills, in particular outside their core business. For the most part, these were the skills brought by their EIGs: more than half said they knew more about data science, one third more about development and one third more about design.

They also said that it was a rewarding professional experience. A mentor explained this as follows: “it’s an opportunity to manage a small, highly productive and competent data team that’s hard to do without afterwards”.

[Photo: two people discussing paper prototypes of an interface displayed on the wall] Co-construction of prototypes for the ACOSS Platform challenge.

Long-lasting tools and efforts made in terms of openness

A number of results have also been achieved within the challenges, in particular in terms of openness.

📍 Most of the tools developed will be put on a permanent footing in host administrations, in various forms. 5 tools have already been integrated into the information systems of administrations, and 9 more will be by March 2020.

📍 From the open source perspective, 10 challenges (out of 15) have online code repositories. 2 challenges (ExploCode and Open Chronic) have published reusable libraries (see here and here). Also of note is Plume’s cooperation with the Editoria open source community for the Court of Auditors’ drafting tool.

📍 In terms of open data, 3 challenges (EIG Link, Open Chronic and CibNav) have opened up data sets. The [CartoBio](http://www.cartobio.org/#/) challenge, which makes organic farming land plot data available on demand, is planning an open data release in the near future. These results fall short of our expectations. A few explanations: some challenges did not lend themselves to opening up data sets, while others lacked the time and/or resources required. Finally, our coaching in this regard could perhaps be refined.

We do not focus here on the effectiveness and usefulness of achievements for each challenge. If you are interested in this topic, data are available on the individual pages of each challenge.

You can also find a description of each challenge and the next steps in the promotion feedback booklet.

What is the role of the EIG programme?

📍 Over the past 3 years we have established a large EIG programme community, the core of which is the current EIG promotion and former EIGs and mentors. It is supported, among other things, by the members of the panel, our partners and the members of the direction interministérielle du numérique (the State’s interministerial directorate of digital technology).

The promotion sees this community as essential. An EIG explained: “I would not have applied without the community and I think the project probably wouldn’t have got this far without it: it’s more than essential”.

📍 The most popular parts of the coaching programme are in fact the collective formats: bootcamp, off-site seminars, group coaching sessions.

This community also enables significant peer-to-peer learning: 25 EIGs said they had developed skills outside their core business, including legal and administrative knowledge, technical skills imparted by other EIGs, and project management skills. A third of them believed that they had also improved technically.

📍 Also of note is the success of the “Bulletins” information-sharing tool, developed by the programme, which is considered to be very relevant. The weekly bulletins were read at least once a month by more than 80% of EIGs.

The EIG ecosystem extends beyond the promotion. The vast majority of EIGs were thus in contact with members of the direction interministérielle du numérique during the year, including members of Etalab. EIGs very much appreciated these interactions.

📍 One of our greatest sources of pride: the satisfaction of EIGs and mentors. 75% of EIGs and the vast majority of mentors said that they were satisfied or very satisfied with the results of their project. 5 out of 12 mentors also said that the programme exceeded their expectations, as did 10 out of 27 EIGs.

[Photo: a group of people posing for a group photo] The EIG 3 promotion (EIGs and mentors) at the feedback event on 12 November.

Ways to improve the EIG programme

1. Better support administrations in defining the skills they are looking for.

Overall, teams found that the projects were fairly well defined, or that any lack of definition allowed EIGs to take ownership of the project (with suitable policy backing). However, some teams explained that they would have preferred more definition, in particular regarding the EIG profiles sought. We will pursue this avenue for future promotions. We also want to create more balanced promotions in terms of skills and subjects, with data scientists, designers and developers in equal proportions. This year, the inclusion of more designers in the promotion and the success of the “Public interest designers” experiment confirmed the importance of design for public innovation.

2. Consolidate links with the community of former EIGs.

EIG 3 participants regretted having had little contact with EIGs from previous promotions. We are therefore considering opportunities for inter-promotion interactions. These include the creation by a few EIG alumni of the LEON association (open digital entrepreneurs), intended to bring together former EIGs and mentors.

3. Continue to adjust the coaching programme to the specificities of each promotion.

The collective coaching sessions were popular as a means of bringing the promotion together, but their format did not always live up to the expectations of EIGs and mentors. Two of the main suggestions were to implement more structured peer-to-peer learning formats (considered more useful than “top-down” actions) and to involve more outside participants.

In addition, we still need to iterate on the tasks of the EIG Link. This third promotion saw the introduction of a second EIG Link (a designer), intended to strengthen the internal technical coaching available to EIGs. This introduction was appreciated. However, EIG Links have a large number of very different tasks, ranging from technical support to strategic advice, coordination of the EIG community and the production of tools. The breakdown between these roles will need to be clarified in order to make coaching more effective.

Finally, at what level should a “challenge” be coached? This year, we reinforced the individual follow-up of each challenge through regular calls with teams. However, some EIGs expressed the need to be able to interact with the coaching team without other EIGs from their challenge being present. This option, which would allow earlier detection of possible interpersonal issues at team level, has the disadvantage of being very time-consuming for the programme management team.

4. Better equip EIGs and mentors on the subjects of digital transformation and sustainability.

EIGs noted that a lot of information was sent all at once, at times that were not necessarily optimal for the challenges. Mentors expressed the wish to have more practical information on how to sustain their projects. The [documentation](https://entrepreneur-interet-general.etalab.gouv.fr/blog/2019/09/12/documentation-programme.html) website that we structured this year can be drawn on further in the future to meet these needs, by giving participants more autonomous access to information.

5. Consolidate the programme’s visibility for administrations and EIGs.

While the programme is now well established in the public innovation landscape, there are still a number of challenges facing us. How can we reach out to administrations unfamiliar with the digital transformation ecosystem and encourage them to apply? For EIGs, we want to consolidate our recruitment channels to reach out to more senior profiles. This year was also an opportunity to consolidate the visibility of the programme in France and internationally, and to establish a partnership with the Latitudes association.

And now?

In 2020, the Public Interest Entrepreneurs programme is continuing! It benefits from funding via the 2020 finance law. There are two objectives within this scope: recruit 50 EIGs in 2020 (for 20 to 25 projects), and put some of them on a permanent footing in the administration.

Since the beginning of 2020, we have been working to establish an ambitious promotion 4, due to start in September 2020.

In addition, we will continue to assess the programme roughly every six months to monitor the progress of projects and EIGs.

We are very interested in sharing experiences. If you too are running an innovation programme, and you have advice or similar experience in relation to the above-mentioned avenues, our assessment systems or the promotion’s results, feel free to contact us at entrepreneur-interet-general@data.gouv.fr!


🧐 Focus on assessment methodology

In order to measure the results of the EIG 3 promotion, we used four methods:

  • A spreadsheet filled in by the EIGs of each challenge, listing factual data on the projects: achievements and jobs put on a permanent footing, level of openness of source code and data, training sessions organised, etc.;
  • Anonymous questionnaires distributed to EIGs and mentors on 7 November 2019, one week before the promotion’s official end (even though many challenges continued after 15 November), relating to their motivations, learning, difficulties and lessons learned on the programme. These questionnaires were based on those developed to assess promotions 1 and 2 (see our [analysis report](https://entrepreneur-interet-general.etalab.gouv.fr/blog/2019/06/26/rapport-analyse-eig.html) of June 2019);
  • Informal feedback workshops on the programme, organised from July 2019 onwards during the coaching sessions;
  • As part of the [Programme d’investissements d’avenir - Future Investment Programme](https://www.gouvernement.fr/le-programme-d-investissements-d-avenir) funding, mentors are also invited to develop indicators specific to their project. They fill in these indicators on a self-assessment basis at the end of their promotion.

Data from the spreadsheet and questionnaires are available online at data.gouv.fr.

This methodology provides us with relatively effective quantitative and qualitative feedback on the programme. It has certain limitations:

  • Due to a lack of time and resources, we only interview the promotion’s EIGs and mentors, and do not visit the premises of the host administrations or meet users. It is difficult to measure the digital transformation of administrations or the satisfaction of users without seeing them directly.
  • Not all EIGs and mentors answered the questionnaire: 27 out of the 30 EIGs surveyed answered, as did 12 out of the 18 mentors. The figures and opinions obtained do not reflect developments after 7 November 2019 (date the questionnaires were filled in).
  • As the assessment was undertaken exclusively in-house, doubts could be raised about the impartiality of the interpretation and/or presentation of results. To address this, we also use other types of assessment methods.

Finally, we changed our assessment method between the first two promotions and the third, which makes comparisons between promotions difficult. To address this, we hope to keep the same methodology for promotion 4. In addition, the results of the questionnaire corroborated and supported conclusions already drawn in previous reports (see the analysis report on the EIG 1 and 2 promotions) and blog posts (see the post on the success factors of an EIG challenge).

Despite their limitations, these tools allow us to identify a number of key results for the current promotion and areas for improvement for the subsequent ones.