DG DIGIT / ISA Programme

D04.02 – 31-03-2014 – Monthly Evaluation Report

Action 4.2.4 European Federated Interoperability Repository

Specific Contract 13 within Framework Contract DI/07171 – Lot 2

Document Metadata

Property       Value
Release date   2014-04-10
Status         Acceptance
Version        1.00
Authors        Romain Prudhomme – PwC EU Services
               Joan Bremers – PwC EU Services
               Nikolaos Loutas – PwC EU Services
Reviewed by    Pieter Breyne – PwC EU Services
Approved by    Szabolcs Szekacs – DIGIT B2

Document History

Version   Date         Description             Action
0.01      2014-04-01   Initial draft           Creation
0.02      2014-04-02   Update after comments   Update
0.03      2014-04-10   Delivered for review    Review
1.00      2014-05-16   Accepted                Acceptance

Table of Contents

1 INTRODUCTION
  1.1 Context
  1.2 Objective
  1.3 Scope
  1.4 Structure
  1.5 Glossary

2 WEB PERFORMANCE METRICS
  2.1 Unique visitors
  2.2 Total visits
  2.3 Total visits per country
  2.4 Total page views – catalogue
  2.5 Total page views – most viewed interoperability solutions
  2.6 Page views per visit
  2.7 Average visit duration
  2.8 Page views per visit and average visit duration per channel source
  2.9 Bounce rate
  2.10 New vs. returning visitors
  2.11 Metadata page views
  2.12 Top referring search terms
  2.13 Top search terms (internal search)
  2.14 Exits to a website of a federation partner
  2.15 Downloads

3 GOALS
  3.1 Goal 1 – external site/search engine -> catalogue -> asset page
  3.2 Goal 2 – external site/search engine -> catalogue -> asset page -> download/outbound link
  3.3 Goal 3 – external site/search engine -> asset page
  3.4 Goal 4 – external site/search engine -> asset page -> download/outbound link
  3.5 Goal 5 – catalogue -> asset page
  3.6 Goal 6 – catalogue -> asset page -> download/outbound link
  3.7 Goal 7 – asset page -> download/outbound link

4 CONCLUSION
  4.1 Importance of EFIR for the platform
  4.2 Real difference between "visitors" and "users"
  4.3 Comparison with previous month
  4.4 Good search engine optimisation of asset and software pages
  4.5 Key questions
    4.5.1 WHAT are the most popular paths to downloading an interoperability solution (so-called asset release on Joinup)?
    4.5.2 WHICH assets, asset releases and repositories are the most popular (most visited, with the highest number of downloads) and WHY?
    4.5.3 HOW long and HOW often do people browse the repository?
    4.5.4 WHAT is the geographic distribution of the users of EFIR?
    4.5.5 HOW do people experience the visit to the catalogue and the catalogue itself?
    4.5.6 HOW do people use the search functionalities of EFIR (focusing on the advanced search)?

ANNEX 1 – IN-PAGE ANALYTICS

List of Tables

Table 1 - Glossary

List of Figures

Figure 1 - Overview of visits with language "c" (suspected bots)
Figure 2 - Unique visitors
Figure 3 - Detailed comparison of EFIR and EFIR Engaged Users
Figure 4 - Total visits
Figure 5 - Europe and World map
Figure 6 - Top 5 non-EU countries per visit
Figure 7 - Detailed report of performance per EU country
Figure 8 - Catalogue usage comparison
Figure 9 - Catalogue usage
Figure 10 - Top 10 interoperability solutions
Figure 11 - Top 10 federated repositories
Figure 12 - Top 10 federated projects
Figure 13 - Top 10 software solutions
Figure 14 - Top 10 federated forges
Figure 15 - Pages per visit comparison
Figure 16 - Average visit duration comparison
Figure 17 - In-depth channel source analysis
Figure 18 - Bounce rate
Figure 19 - Percentage of new visits
Figure 20 - New vs. Returning visitor average
Figure 21 - Metadata page view comparison
Figure 22 - Top 10 keywords
Figure 23 - SEO results from Google Webmaster Tools
Figure 24 - Internal search usage overview
Figure 25 - Top 10 internal search terms
Figure 26 - Advanced searches
Figure 27 - Total number of Events (Downloads & Outbound links)
Figure 28 - Top 10 Outbound Links
Figure 29 - Top 25 downloads
Figure 30 - EFIR Goal 1
Figure 31 - EFIR Goal 2
Figure 32 - EFIR Goal 3
Figure 33 - EFIR Goal 4
Figure 34 - EFIR Goal 5
Figure 35 - EFIR Goal 6
Figure 36 - EFIR Goal 7
Figure 37 - Performance of March vs. February
Figure 38 - In-page analytics for Catalogue of Assets
Figure 39 - In-page analytics for Open-Source Software Catalogue

1 INTRODUCTION

1.1 Context

This report is prepared in the context of Action 4.2.4 European Federated Interoperability Repository (EFIR). A key to federating further national and local repositories or standardisation bodies into the EFIR will be their representation on the Joinup platform. The solutions will need to be described in a high-quality, informative manner and be easily re-usable.

Therefore, a formal evaluation of the performance of the Joinup platform is performed on a monthly basis, focusing on the EFIR activities. During the overall project, from February until July 2014, a report is published each month describing the performance of EFIR on Joinup during the reporting month, based on the metrics defined in Sub-task 04.01. This report covers the month of March 2014.

1.2 Objective

The objective of these evaluation reports is to provide the necessary data-based evidence to the stakeholders of EFIR – both the interoperability solution providers and the re-users – upon which the value and future sustainability of the Repository can be assessed.

To do so, the performance evaluation provided in the context of this project aims at answering the following key questions:

1. WHAT are the most popular paths to downloading an interoperability solution (a so-called asset release on Joinup)?
2. WHICH assets, asset releases and repositories are the most popular (most visited, with the highest number of downloads) and WHY?
3. HOW long and HOW often do people browse the repository?
4. WHAT is the geographic distribution of the users of EFIR?
5. HOW do people experience the visit to the catalogue and the catalogue itself?
6. HOW do people use the search functionalities of EFIR (focusing on the advanced search)?

1.3 Scope

The scope of this evaluation is the EFIR on the Joinup platform. The other functionalities of the Joinup platform are out of scope for this evaluation, as are assets that do not qualify as interoperability solutions.

1.4 Structure

The remainder of this deliverable is structured as follows:

- In chapter 2 we present the web performance metrics;
- In chapter 3 we present the results of the pre-defined goals; and
- In chapter 4 we provide a conclusion on the results.

1.5 Glossary

Throughout this report, we use the following terms as defined by Google Analytics:

Table 1 - Glossary

Visit: A visit refers to an active user browsing a page of the website. Only pages browsed by a human count towards a visit; search engine robots are not counted.¹

Active user: An active user is a user having a browsing session.

Browsing session: A browsing session is a session during which a user is considered to be active on the website. Its duration is set to 30 minutes and it expires if the cookies are cleared.

Unique visit: A unique visit is defined per browser and per computer. Technically, a unique visit is identified by Google Analytics using the __utma cookie. Following this definition, closing or changing the browser/computer increases the number of unique visits, and clearing the cookies increases the number of unique visits.

¹ https://support.google.com/analytics/answer/1315708?hl=en

2 WEB PERFORMANCE METRICS

All metrics below have been gathered for the period from March 1st until March 31st, 2014.

In order to keep the metrics as accurate as possible, we excluded all data coming from Zaventem, so as not to count the visits and page views of the PwC consultants working on the Joinup platform.

Please note that, due to the installation of a new Drupal module for Google Analytics and some initial technical difficulties, the on-site search data was only gathered as from March 5th. Since 26 days of data are still available, we report on the on-site search usage, but we advise the reader to keep these missing days in mind.

We noted a very sharp increase in the number of visits and visitors starting from March 15th (almost a doubling). After investigation, we found that this increase was largely caused by unique visitors who viewed only one page for a very short time (less than 3 seconds) and reported "c" as their language. This value does not correspond to any language in the ISO 3166 and ISO 639 standards used by Google Analytics; "c" refers to the C locale, the default Unix locale, and is usually the sign of a bot, scraper or other tool running from a Unix command line.

In order to keep the analysis relevant, we excluded visitors with language "c" from the results presented below (an illustrative sketch of these exclusion rules is given after Figure 1).

Figure 1 gives an overview of the suspected bot behaviour.

Figure 1 - Overview of visits with language "c" (suspected bots)
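For illustration only (not part of the original report, which applied these exclusions directly in Google Analytics): a minimal Python sketch of the two exclusion rules described above, assuming a hypothetical CSV export of visits with "city" and "language" columns.

```python
import csv


def keep_visit(row):
    """Apply the two exclusion rules described in section 2:
    - drop internal traffic reported from Zaventem (PwC consultants), and
    - drop suspected bots reporting the C locale ("c") as their language."""
    return row["city"] != "Zaventem" and row["language"].lower() != "c"


# Hypothetical export file and column names, for illustration only.
with open("efir_visits_2014-03.csv", newline="") as f:
    kept = [row for row in csv.DictReader(f) if keep_visit(row)]

print(len(kept), "visits retained after exclusions")
```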

2.1 Unique visitors

Figure 2 presents the monthly comparison (day by day) of the number of visitors for the EFIR and EFIR Engaged Users segments.

Figure 2 - Unique visitors

As shown in Figure 2, engaged users represent only about 15% of the visitors on EFIR (a slight decrease from 16.5% in February).

The EFIR segment represents 44.5% of all traffic on Joinup (up from 43.44%), while the EFIR Engaged Users segment represents 6.73% (down from 7.20%) of all traffic on Joinup. A detailed comparison of both segments is presented in Figure 3.

Figure 3 - Detailed comparison of EFIR and EFIR Engaged Users
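As a quick consistency check (our own arithmetic, assuming both shares are expressed relative to the same overall Joinup traffic):

```latex
\frac{6.73\%}{44.5\%} \approx 15.1\%
```

which matches the roughly 15% of EFIR visitors classified as engaged users.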

2.2 Total visits

Figure 4 presents the monthly report (day by day) of visits on the EFIR segment.

Figure 4 - Total visits

Figure 4 shows that the total number of visits including EFIR pages is quite stable over the month of March 2014: about 550 visits per day on weekdays and 200 per day on weekends.

2.3 Total visits per country

In this subsection we present the monthly reporting on the locations from which visits to the EFIR segment came. Figure 5 shows the maps of the locations of visits on EFIR: the darkest blue represents a large number of visits, the lightest blue a small number of visits.

Figure 5 - Europe and World map

In order to get a view on the usage of EFIR outside the borders of the European Union, Figure 6 presents a detailed report of the top 5 non-EU countries in terms of visits to the EFIR segment.

Figure 6 - Top 5 non-EU countries per visit

In order to get a view on the usage of the EFIR platform within the European Union, Figure 7 provides detailed reporting on the performance of EFIR in the different EU countries.

Figure 7 - Detailed report of performance per EU country

2.4 Total page views – catalogue

In Figure 8 we present how many times each of the catalogue pages was consulted during the month of March 2014, for the EFIR segment (in blue) compared with the EFIR Engaged Users segment (in green).

Figure 8 - Catalogue usage comparison

We observed that, despite the large difference in the overall number of visitors between EFIR and EFIR Engaged Users (presented in subsection 2.1), the difference in the number of catalogue page views is much smaller. This means that the catalogue pages are heavily used by engaged users.

Figure 9 shows the daily comparison of aggregated catalogue page views.

Figure 9 - Catalogue usage

Detailed in-page analytics of the Catalogue of Assets (Figure 38) and the Catalogue of Open-Source Software (Figure 39) are presented in Annex 1.

2.5 Total page views – most viewed interoperability solutions

Below we present this month's most popular interoperability solutions (Figure 10), federated repositories (Figure 11), federated projects (Figure 12), software solutions (Figure 13) and federated forges (Figure 14) for the EFIR segment and the EFIR Engaged Users segment.

Figure 10 - Top 10 interoperability solutions

Figure 11 - Top 10 federated repositories

Figure 12 - Top 10 federated projects

Figure 13 - Top 10 software solutions

Figure 14 - Top 10 federated forges

2.6 Page views per visit

In Figure 15 we present the monthly comparison (day by day) of the depth of visits (number of pages viewed per visit) for each segment.

Figure 15 - Pages per visit comparison

Note that the EFIR segment has a bounce rate of about 50% (i.e. 50% of the visits consist of only one page, after which the visitor left). Non-bouncing EFIR visitors therefore view, on average, roughly twice as many pages per visit as the value shown by the blue line.
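As a back-of-the-envelope justification of that factor of two (our own derivation, not taken from the report): with p the average number of pages per visit over all visits and b the bounce rate, bounced visits contribute exactly one page each, so the average over non-bouncing visits is

```latex
\bar{p}_{\text{non-bounce}} = \frac{p - b}{1 - b}
  \;\overset{b = 0.5}{=}\; 2p - 1 \;\approx\; 2p \quad \text{for } p \gg 1.
```

With the 3.64 pages per visit reported for the EFIR segment (subsection 4.5.3), this gives roughly 6.3 pages per non-bouncing visit.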

2.7 Average visit duration

In Figure 16 we present a monthly comparison (daily numbers) of the length of the visits for the EFIR and the EFIR Engaged Users segments.

Figure 16 - Average visit duration comparison

This is where the difference between engaged and non-engaged users is most significant: whereas we defined engaged users as visitors spending more than 180 seconds (3 minutes) on Joinup (and visiting more than 3 pages), we observe an average visit duration of nearly 18 minutes for that segment during the month of March 2014. At the other end, the average duration of a visit for the EFIR segment is 3:28 minutes, which drops to 33 seconds if we remove the Engaged Users (who are included in the EFIR segment).

Even taking into consideration that half of these visitors are bounces, we see a clear gap between engaged users and non-engaged users.
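The 33-second figure follows from a weighted-average decomposition. A minimal sketch of that arithmetic (our own reconstruction; the visit counts below are placeholders, not the report's actual numbers):

```python
def non_engaged_avg_duration(total_visits, total_avg_s,
                             engaged_visits, engaged_avg_s):
    """Back out the average duration of non-engaged visits.

    The EFIR segment average (total_avg_s) includes the Engaged Users
    segment, so its contribution is subtracted from the weighted total.
    """
    total_seconds = total_visits * total_avg_s
    engaged_seconds = engaged_visits * engaged_avg_s
    return (total_seconds - engaged_seconds) / (total_visits - engaged_visits)


# Purely illustrative inputs (the report does not publish the per-segment
# visit counts); with the real counts this calculation yields the 33 seconds
# quoted above.
print(non_engaged_avg_duration(total_visits=13000, total_avg_s=208,
                               engaged_visits=2000, engaged_avg_s=1080))
```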

2.8 Page views per visit and average visit duration per channel source

In Figure 17 we present the number of visits, the number of pages viewed per visit and the average duration of a visit, grouped by channel source (the source of the visit), for the 5 main sources of traffic to EFIR.

Figure 17 - In-depth channel source analysis

The channel source "(direct)" means that the visitor entered Joinup's URL directly in the browser.

The main channel for the EFIR and EFIR Engaged Users segments is Google's search engine, which is the source of almost 50% of all traffic on Joinup.

We also observe a large amount of traffic coming from agid.gov.it, which partly explains the high traffic received from Italy. This is an example of a good referral to EFIR on that website.

2.9 Bounce rate

Figure 18 presents an analysis of the bounce rate on the EFIR segment. By bouncing we mean a visitor arriving on an EFIR page and leaving the website without doing anything else.

Figure 18 - Bounce rate

Additionally, Figure 38 and Figure 39 in Annex 1 present the average bounce rate on the two catalogue pages for the month of March 2014.

2.10 New vs. returning visitors

In Figure 19 we present the day-by-day evolution of the percentage of new visits.

Figure 19 - Percentage of new visits

As shown in the figure above, the percentage of new visits remains fairly constant over the period and is only slightly higher during weekends.

Figure 20 presents the monthly average of new versus returning visitors.

Figure 20 - New vs. Returning visitor average

A share of 70% new visitors means that EFIR is currently quite successful in attracting new visitors. However, since the total number of visitors is stagnant, this high share of new visitors also means that EFIR currently has difficulty retaining users.

2.11 Metadata page views

Figure 21 presents a comparison of the number of page views for the metadata pages of EFIR (the metadata export functionality).

Figure 21 - Metadata page view comparison

Over the period of March 2014, no visitor made use of this service.

2.12 Top referring search terms

In Figure 22 we present the 10 most popular keywords driving traffic to EFIR.

Figure 22 - Top 10 keywords

"(Not provided)" means that Google Analytics was not able to obtain the queried keywords from the search engine. In most cases this is due to the cookie policy of the visitor or their browser.

In order to provide a more detailed view of the search keywords, we set up Google Webmaster Tools for Joinup and started gathering Google Search data, which is presented in Figure 23.

Figure 23 - SEO results from Google Webmaster Tools

'Impressions' is the number of times Joinup appeared in the results of a given query on Google Search (for example, Joinup appeared 2000 times in results because people searched for "semic" on Google). One important remark is that one search often produces several impressions: if a search for "joinup" returns three results leading to Joinup, this is counted as 3 impressions.

'Average position' represents the position at which Joinup appeared in the results of the query. One important note is that we are tracking https://joinup.ec.europa.eu, and as a result Google does not count the average position if the result is http://joinup.ec.europa.eu (most of the time the 'https' result is first or second, just behind the 'http' one).

'CTR' is the 'Click-Through Rate', i.e. the percentage of impressions on which people clicked and thereby accessed Joinup.
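Spelled out (this formula is implicit in the definition above rather than stated in the report):

```latex
\text{CTR} = \frac{\text{clicks}}{\text{impressions}} \times 100\%
```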

It is important to note that we cannot distinguish EFIR and non-EFIR visits in the SEO results, and that these results only take Google searches into consideration. They are nonetheless a good way of representing the most popular keywords leading to Joinup from a public search engine. For March, Webmaster Tools was able to gather query data for 81.48% of the visits coming from Google.

2.13 Top search terms (internal search)

In Figure 24 we present an overview of the internal search engine usage (as from March 5th only). It shows that only 3% of EFIR visitors use the internal search tool.

Figure 24 - Internal search usage overview

It is interesting to see that only 10.62% of searches result in an exit. In the other cases, the visitor browsed through an average of 5.28 pages after searching.

In Figure 25 we present detailed statistics on the top 10 internal search terms.

Figure 25 - Top 10 internal search terms

Figure 26 presents the advanced searches for the month of March 2014. Very few visitors used the advanced search capabilities.

Figure 26 - Advanced searches

2.14 Exits to a website of a federation partner

Figure 27 presents the total number of events during the month of March (day by day). We defined downloads and clicks on outbound links (hyperlinks to partner websites) as events in Google Analytics.

Figure 27 - Total number of Events (Downloads & Outbound links)

In Figure 28 we present the 10 most popular destinations of outbound links on Joinup.

Figure 28 - Top 10 Outbound Links

During the month of March 2014, EFIR referred 1122 visits to the websites of partners and publishers, thus providing added value and an incentive for these publishers to share their solutions on Joinup.

2.15 Downloads

Figure 29 presents the 25 most popular downloads (assets and software). During the month of March 2014, 1566 assets and software packages were downloaded.

Figure 29 - Top 25 downloads

3 GOALS

In the Evaluation Plan we identified 7 behavioural goals in order to answer some of the evaluation questions.

It is important to note the following about the goal definitions (an illustrative classification sketch follows the list below):

- By catalogue, we mean any of the 5 catalogue pages:
  o http://joinup.ec.europa.eu/catalogue/all*
  o http://joinup.ec.europa.eu/software/all*
  o http://joinup.ec.europa.eu/asset/all*
  o http://joinup.ec.europa.eu/catalogue/repository
  o http://joinup.ec.europa.eu/software/federated_forge
- By asset page, we mean any of the following page categories:
  o /asset/…
  o /asset_release/…
  o /software/…
  o /federated_projects/…
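The following minimal sketch (our own illustration; the helper names and the representation of a visit as a list of page paths are assumptions, not part of the report) shows how the page categories above can be used to classify a path and to detect the Goal 1 pattern:

```python
import re

# Path patterns transcribed from the goal definitions above.
CATALOGUE_PATTERNS = [
    r"^/catalogue/all", r"^/software/all", r"^/asset/all",
    r"^/catalogue/repository$", r"^/software/federated_forge$",
]
ASSET_PREFIXES = ("/asset/", "/asset_release/", "/software/", "/federated_projects/")


def page_type(path):
    """Classify a Joinup page path as 'catalogue', 'asset' or 'other'."""
    if any(re.match(pattern, path) for pattern in CATALOGUE_PATTERNS):
        return "catalogue"
    if path.startswith(ASSET_PREFIXES):
        return "asset"
    return "other"


def matches_goal_1(landed_from_external, pages):
    """Goal 1: external site/search engine -> catalogue -> asset page."""
    types = [page_type(p) for p in pages]
    return (landed_from_external
            and bool(types) and types[0] == "catalogue"
            and "asset" in types[1:])


# Example: a visit landing from Google on the catalogue, then opening an asset page
# (the asset path here is hypothetical).
print(matches_goal_1(True, ["/catalogue/all", "/asset/adms/description"]))
```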

3.1 Goal 1 – external site/search engine -> catalogue -> asset page

In Figure 30 we present the metrics for users who complete the pre-defined goal: the user arrives on Joinup on a catalogue page via an external site or search engine, and goes from that catalogue page to consulting an asset page (after searching or browsing through the assets in the catalogue).

Figure 30 - EFIR Goal 1

The number of visitors exhibiting the behaviour of goal 1 is quite limited, and remained roughly constant when comparing February (178 over 25 days) and March (218 over 31 days). However, considering the average number of pages viewed per visit (9.40, a slight decrease from 10.04) and the average duration of each visit (00:09:36, a slight decrease from 00:10:25), we can conclude that these visitors have really engaged with EFIR.

3.2 Goal 2 – external site/search engine -> catalogue -> asset page -> download/outbound link

In Figure 31 we present the metrics for users who complete the pre-defined goal: the user arrives on Joinup via an external site or search engine and lands directly on a catalogue page, then browses to an asset page and downloads an asset or software package or clicks on a link towards a publisher's website.

Figure 31 - EFIR Goal 2

As this goal is very restrictive (the user has to land on a catalogue page), the number of visitors is very limited. This is because most asset and software pages are easy to reach directly from an external search engine (as we will see with Goal 3).

Nevertheless, it is encouraging for the user-friendliness of EFIR to observe that 45% of the visits accomplishing goal 2 (which represents the ideal and most efficient usage of the repository) are new visits. This means that the easiest path to the software and assets is clear even for first-time visitors.

3.3 Goal 3 – external site/search engine -> asset page

In Figure 32 we present the metrics for users who complete the pre-defined goal: the user arrives on Joinup via an external site or search engine and lands directly on an asset page.

Figure 32 - EFIR Goal 3

We note an increase in the number of unique visitors accomplishing goal 3, from 7056 in February to 9145 in March.

When we compare goals 1 and 3 in terms of the number of visits (259 vs. 11466) and unique visitors (218 vs. 9145), we see that most visitors do not arrive on Joinup through a catalogue page, but rather arrive directly on an asset page. Based on this we can conclude that:

- The asset pages are well described and thus easily findable through a standard search engine (such as Google or Bing);
- Because EFIR's interoperability solutions are easily findable on standard search engines, they are highly visible there, which means added promotion for our publishers, thanks to their good descriptions on Joinup and the platform's high ranking in standard search engines;
- The 9145 unique visitors represent about 40% of Joinup's overall traffic (and 89% of EFIR's), which allows us to say that the interoperability solutions promoted by EFIR on Joinup are highly visible on the platform.

3.4 Goal 4 – external site/search engine -> asset page -> download/outbound link

In Figure 33 we present the metrics for users who complete the pre-defined goal: the user arrives on Joinup via an external site or search engine, lands directly on an asset page, and downloads an asset or software package or clicks on a link towards a publisher's website.

Figure 33 - EFIR Goal 4

When we compare goal 4 with goal 3, we note that out of the 9145 unique visitors to asset pages, 1078 download an asset or software package or go to the website of a publisher/partner. This is a rather good result, especially if we factor in that 57% of the goal 3 visits are bounces: excluding those means that 1186 visits out of 4930 ended in a download or a referral to a partner's website, which is a good ratio.
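For reference, our own arithmetic behind that statement, using the figures quoted above:

```latex
11\,466 \times (1 - 0.57) \approx 4\,930 \ \text{non-bouncing visits},
\qquad \frac{1\,186}{4\,930} \approx 24\%.
```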

3.5 Goal 5 – catalogue -> asset page

In Figure 34 we present the metrics for users who complete the pre-defined goal: the user arrives on a catalogue page (from anywhere: a search engine, other sections of Joinup, …) and then goes directly to an asset page (after searching or browsing through the assets in the catalogue).

Figure 34 - EFIR Goal 5

When we compare goals 1 and 5, we can conclude that most visitors using the catalogue do so because they were already on Joinup (i.e. they did not land on the catalogue; most likely they landed on a news item, the homepage or a community, and then browsed internally to a catalogue page).

The fact that more than 42% of the visitors using the catalogue to access an asset page are returning visitors (much higher than the EFIR average of 29%) is also a good indication of the quality of the catalogue's search functionality, ease of access and user-friendliness.

3.6 Goal 6 – catalogue -> asset page -> download/outbound link

In Figure 35 we present the metrics for users who complete the pre-defined goal: the user arrives on a catalogue page (from anywhere: a search engine, other sections of Joinup, …), then goes directly to an asset page (after searching or browsing through the assets in the catalogue), and downloads an asset or software package or clicks on a link towards a publisher's website.

Figure 35 - EFIR Goal 6

The comparison of the results of goals 5 and 6 brings some surprises: visitors to asset pages who download assets or software visit on average 6 more pages and stay 7 minutes longer than visitors to asset pages who do not download. We have not yet identified an explanation for this difference in behaviour.

By comparing goals 2 and 6, we also note that most visitors access the catalogue pages from other locations on Joinup and not directly from an external source. This might indicate a low presence of the catalogue pages in public search engines. Given that the numbers for goals 7 and 3 are much higher than for goal 6, we can conclude that asset pages appear more often than catalogue pages in public search engine results.

3.7 Goal 7 – asset page -> download/outbound link

In Figure 36 we present the metrics for users who complete the pre-defined goal: the user arrives on an asset page (from anywhere: a search engine, the catalogue, other sections of Joinup, …) and downloads an asset or software package or clicks on a link towards a publisher's website.

Figure 36 - EFIR Goal 7

By comparing goal 6 with goal 7, we note that most visitors do not use the catalogue to access the asset or software they wish to download.

4 CONCLUSION

4.1 Importance of EFIR for the platform

Based on the gathered metrics, our conclusions about the importance of EFIR for Joinup remain unchanged from February:

- Overall, EFIR represents 44.5% of all traffic on Joinup, meaning that the published interoperability solutions are highly visible on Joinup;
- The EFIR pages have a significantly lower bounce rate than the average Joinup page (50% vs. 63%, taking into consideration that EFIR pages are included in the general Joinup metric); and
- About 69% of EFIR's visits in March came from new visitors, meaning that EFIR is successful in attracting new visitors.

4.2 Real difference between "visitors" and "users"

As in February, we can distinguish EFIR's "visitors" from EFIR's "users", as shown by the large gap between the EFIR Engaged Users segment and the generic EFIR segment in terms of pages per visit and average visit duration.

4.3 Comparison with previous month

In Figure 37 we present the performance comparison between March 2014 (in blue) and February 2014 (in orange). Please keep in mind that the reporting for February covered only 25 days (for the comparison of absolute numbers, this means that the March 2014 period was 24% longer).

Figure 37 - Performance of March vs. February

As can be seen above, EFIR performed better in March on the quantitative metrics (visits, unique visitors, page views), but worse on the qualitative metrics (visit depth, i.e. pages per visit, and visit duration).

The percentage of new visits should be interpreted carefully: due to the slight increase in the number of unique visitors per day, the number of new visitors per day increased as well (from 285 in February to 299 in March), even though their relative share decreased by 0.85 percentage points.
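Our own check of the quoted period difference:

```latex
\frac{31}{25} - 1 = 0.24 = 24\%.
```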

4.4 Good search engine optimisation of asset and software pages

From the results observed across the 7 goals, we can conclude that asset and software pages can easily be reached directly from public search engines, without having to browse through the catalogue or other pages:

- 193 visitors went on to download an asset after passing through the catalogue, versus 1214 who did not pass through the catalogue;
- 1199 visitors passed through the catalogue to access an asset page, against 9145 visitors who did not pass through the catalogue;
- The catalogue pages, on the other hand, are quite seldom the landing page of a visit, which makes us question how well the catalogue pages themselves are referenced by search engines.

4.5 Key questions

4.5.1 WHAT are the most popular paths to downloading an interoperability solution (so-called asset release on Joinup)?

Most visitors arrive directly on an interoperability solution from a search engine, thanks to the good ranking of these pages in the standard search engines. The catalogue pages are the least used way of arriving on an interoperability solution.

4.5.2 WHICH assets, asset releases and repositories are the most popular (most visited, with the highest number of downloads) and WHY?

The most popular assets, asset releases and repositories are presented in subsection 2.5.

In March, the most popular for consultation were:

- Assets: DCAT Application Profile and ADMS
- Software: SD-DSS
- Federated repositories: NIO portal (Slovenian National Interoperability Framework) and epSOS (European Patients Smart Open Services project)
- Federated forges: ES Technology Transfer Center and ADULLACT

In March, the most popular downloads were:

- SD-DSS documentation and DSS packages

In March, EFIR referred the most traffic to:

- http://ec.europa.eu(/isa)
- http://kmkey-es.blogspot.com.es

4.5.3 HOW long and HOW often do people browse the repository?

On average, a visit lasts 3:28 minutes and 3.64 pages are viewed per visit. However, as explained in subsection 4.2, there is a large gap between visitors and users.

4.5.4 WHAT is the geographic distribution of the users of EFIR?

Out of the 13,000 visits to the repository in March, 8900 came from an EU Member State. A detailed representation of the geographic distribution can be found in Figure 7 for each Member State, and in Figure 6 for the 5 non-Member State countries generating the most visits to the repository.

4.5.5 HOW do people experience the visit to the catalogue and the catalogue itself?

The catalogue is used to a large extent by returning users (45%, against an EFIR-wide average of 29%), and only a very limited number of visitors actually land on the catalogue when visiting the repository. Visitors going through the catalogue also stay longer on the repository (13 minutes on average) and browse more pages (13.38).

4.5.6 HOW do people use the search functionalities of EFIR (focusing on the advanced search)?

Most (72.41%) of the searches performed on Joinup are made by EFIR visitors. However, this still only represents about 3% of all visits. This can be partly explained by the fact that most visitors arrive directly on the asset or software page that they wish to visit, and thus do not need to use the internal search functionality.

A very limited number of visitors used the advanced search capabilities, and a high proportion of those searches (75%) did not yield any results (refer to Figure 26).

ANNEX 1 – IN-PAGE ANALYTICS

Figure 38 - In-page analytics for Catalogue of Assets

Figure 39 - In-page Analytics for Open-Source Software Catalogue

