Public sector information, open data and big data (RP2020)

Policy and legislation

Policy objectives

With the continuously growing volume of data (often referred to as ‘big data’) and the increasing amount of open data, interoperability is becoming a key issue in exploiting the value of this data.

Standardisation at different levels (such as metadata schemata, data representation formats and licensing conditions of open data) is essential to enable broad data integration, data exchange and interoperability with the overall goal of fostering innovation based on data. This refers to all types of (multilingual) data, including both structured and unstructured data, and data from different domains as diverse as geospatial data, statistical data, weather data, public sector information (PSI) and research data (see also the rolling plan contribution on ‘e-Infrastructures for data and computing-intensive science’), to name just a few.

EC perspective and progress report

Overall, the application of standard and shared formats and protocols for gathering and processing data from different sources in a coherent and interoperable manner across sectors and vertical markets should be encouraged, for example in R&D&I projects and in the EU open data portal (https://data.europa.eu/euodp) and the European data portal (https://data.europa.eu/europeandataportal).

Studies conducted for the European Commission showed that businesses and citizens were facing difficulties in finding and re-using public sector information. The Communication on Open data states that “the availability of the information in a machine-readable format and a thin layer of commonly agreed metadata could facilitate data cross-reference and interoperability and therefore considerably enhance its value for reuse”. [18]

A common standard for the referencing of open data in the European open data portals would be useful. Candidates for a common standard in this area are the DCAT Application Profile for data portals in Europe (DCAT-AP) and the FIWARE open stack-based specification and open standards APIs [19]. The FIWARE solution has now been integrated into the Connecting Europe Facility “Context Broker” building block (https://ec.europa.eu/cefdigital/wiki/display/CEFDIGITAL/Context+Broker). The CEF has meanwhile agreed to upgrade the “Context Broker” to use the ETSI NGSI-LD specification (ETSI GS CIM 009 V1.3.1 of the NGSI-LD API), and the FIWARE Foundation is evolving its API to the same ETSI standard for the exchange of open data. Further effort is now needed to demonstrate good examples of proper usage of NGSI-LD. This has been promoted within the EC Large Scale Pilot project SynchroniCity, but more dissemination and training is required (as recognised by CEF efforts to promote training webinars).
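As a concrete illustration, the sketch below creates and queries a context entity through the NGSI-LD HTTP API defined in ETSI GS CIM 009. The broker URL, entity identifier and attribute are placeholders; a CEF deployment would expose a Context Broker instance at its own endpoint.

    # Hedged sketch of the NGSI-LD API: create one entity, then query by type.
    import requests

    BROKER = "http://localhost:1026/ngsi-ld/v1"  # placeholder broker endpoint

    # An NGSI-LD entity is a JSON-LD document: id, type, Properties, @context.
    entity = {
        "id": "urn:ngsi-ld:OffStreetParking:downtown-1",  # invented identifier
        "type": "OffStreetParking",
        "availableSpotNumber": {"type": "Property", "value": 121},
        "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
    }

    # Create the entity (POST /entities)...
    requests.post(f"{BROKER}/entities", json=entity,
                  headers={"Content-Type": "application/ld+json"})

    # ...and retrieve all entities of that type (GET /entities?type=...).
    r = requests.get(f"{BROKER}/entities", params={"type": "OffStreetParking"},
                     headers={"Accept": "application/ld+json"})
    print(r.json())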

The DCAT Application Profile has been developed as a joint initiative of the ISA2 programme, the Publications Office (PO) and CNECT to describe public-sector data catalogues and datasets, and to promote the specification for use by data portals across Europe. Agreeing on a common application profile and promoting it among the Member States is substantially improving the interoperability among data catalogues and the data exchange between Member States. DCAT-AP is the specification used by the European Data Portal, which is part of the Connecting Europe Facility infrastructure, as well as by a growing number of Member State open data portals. The DCAT-AP-related work, including its extensions to geospatial data (GeoDCAT-AP) and statistical data (StatDCAT-AP), also highlights the need for further work on the core standard. These are topics for the W3C Smart Descriptions & Smarter Vocabularies (SDSVoc) workshop, supported by the VRE4EIC project (https://www.w3.org/2016/11/sdsvoc/). The Core Vocabularies (i.e. Core Person, Core Organization, Core Location, Core Public Event, Core Criterion and Core Evidence), the Core Public Service Application Profile and the Asset Description Metadata Schema (for describing reusable solutions), implemented by the ISA2 programme, address the problem of data exchange and interoperability by using uniform data representation formats. They are currently used in the TOOP (Once-Only Principle) project, which acts as a forerunner for the Single Digital Gateway Regulation.
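To make the role of uniform representation formats concrete, the sketch below assembles a DCAT-style dataset description with the Python rdflib library. All URIs are invented, and a real DCAT-AP record carries further mandatory properties defined by the profile.

    # Hedged sketch: a dcat:Dataset with one dcat:Distribution, serialised as Turtle.
    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import DCAT, DCTERMS, RDF

    g = Graph()
    ds = URIRef("http://example.org/dataset/air-quality")        # invented URIs
    dist = URIRef("http://example.org/dataset/air-quality/csv")

    g.add((ds, RDF.type, DCAT.Dataset))
    g.add((ds, DCTERMS.title, Literal("Air quality measurements", lang="en")))
    g.add((ds, DCTERMS.publisher, URIRef("http://example.org/city-council")))
    # dcat:theme would normally point to a controlled vocabulary such as EuroVoc.
    g.add((ds, DCAT.theme, URIRef("http://example.org/themes/environment")))
    g.add((ds, DCAT.distribution, dist))

    g.add((dist, RDF.type, DCAT.Distribution))
    g.add((dist, DCAT.accessURL, URIRef("http://example.org/data/air-quality.csv")))
    g.add((dist, DCTERMS.license, URIRef("https://creativecommons.org/licenses/by/4.0/")))

    print(g.serialize(format="turtle"))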

The mapping of existing relevant standards for a number of big data areas would be beneficial. Moreover, it might be useful to identify European clusters of industries with sufficiently similar activities to develop data standards. Especially for open data, the topics of data provenance and licensing (for example the potential of machine-readable licences) need to be addressed, as encouraged in the current and proposed revision of the PSI Directive (see section B.1).

The PSI Directive encourages the use of standard licences, which must be available in digital format and be capable of being processed electronically (Article 8(2)). Furthermore, the Directive encourages the use of open licences available online, which should eventually become common practice across the EU (Recital 26). In addition, to help Member States transpose the revised provisions, the Commission adopted guidelines [20] which recommend the use of such standard open licences for the reuse of PSI.

On 25 April 2018, the Commission adopted the ‘data package’ — a set of measures to improve the availability and re-usability of data, in particular publicly held or publicly funded data, including government data and publicly funded research results, and to foster data sharing in business-to-business (B2B) and business-to-government (B2G) settings. The availability of data is essential so that companies can leverage the potential of data-driven innovation or develop solutions using artificial intelligence.

Key elements of the package are:

  1. The adoption of the Directive on open data and the re-use of public sector information (recast of Directive 2003/98/EC amended by Directive 2013/37/EU):
  • enhancing access to and re-use of real-time data notably with the help of Application Programming Interfaces (APIs);
  • lowering charges for the re-use of public sector information by limiting exceptions to the default upper limit of marginal cost of dissemination and by specifying certain high-value data sets which should be made available for free (via implementing acts);
  • allowing for the re-use of new types of data, including data held by public undertakings in the transport and utilities sector and data resulting from publicly funded research;
  • minimising the risk of excessive first-mover advantage in regard to certain data, which could benefit large companies and thereby limit the number of potential re-users of the data in question;
  • defining through an implementing act a list of “high value datasets” belonging to six thematic categories (geospatial, earth observation and environment, meteorological, statistics, companies and company ownership, mobility) to be made available mandatorily free of charge, in machine-readable format and through APIs.

The new Directive (EU) 2019/1024 of the European Parliament and of the Council of 20 June 2019 on open data and the re-use of public sector information was adopted and published on 26 June 2019. It must be transposed into national legislation by 17 July 2021 (https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1561993244509&uri=CELEX:32019L1024).

Article 14 of the open data Directive empowers the Commission to adopt implementing acts laying down a list of specific high value datasets belonging to the six thematic categories set out in the Annex and held by public sector bodies and public undertakings. In order to make the reuse of these datasets more efficient, the Directive provides that they shall be available for free, machine-readable, provided via APIs and, where relevant, as a bulk download. The implementing acts may also specify the arrangements for the publication and re-use of high value datasets, which shall be compatible with open standard licences. They may include terms applicable to re-use, formats of data and metadata and technical arrangements for dissemination. Work on the definition of high value datasets, including an impact assessment study, stakeholder consultations and a public consultation and hearing, will take place in 2020. An implementing regulation on high value datasets is expected to be adopted in the first half of 2021.
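Purely as a hypothetical sketch of what “free, machine-readable, provided via APIs and as a bulk download” can mean in practice, the fragment below exposes an invented dataset both record by record and as one file. The routes, names and the choice of Flask are illustrative, not prescribed by the Directive.

    # Hedged sketch of API plus bulk access to a (fictional) high value dataset.
    from flask import Flask, jsonify, send_file

    app = Flask(__name__)

    RECORDS = [  # stand-in for, e.g., company register extracts
        {"id": 1, "name": "Example Ltd"},
        {"id": 2, "name": "Sample GmbH"},
    ]

    @app.route("/api/companies")        # record-level access via an API
    def list_companies():
        return jsonify(RECORDS)

    @app.route("/api/companies/bulk")   # whole-dataset bulk download
    def bulk():
        return send_file("companies.csv", mimetype="text/csv", as_attachment=True)

    if __name__ == "__main__":
        app.run()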

2. Review of the 2012 Recommendation on access to and preservation of scientific information, focusing on:

  • evaluating the uptake of the 2012 Recommendation as well as its effectiveness in creating a level playing field for Member States, researchers and academic institutions;
  • updating and reinforcing the overall policy with the development of guidelines on opening up research data and the creation of incentive schemes for researchers sharing data;
  • ensuring coherence with the European Open Science Cloud.

3. Development of guidance on private sector data sharing

The Commission has proposed guidance to companies that wish to make data available to other companies or to public authorities, which lays down principles of fair data sharing practices and includes guidance on legal, business and technical aspects of B2B and B2G data sharing.

Following an open selection process, in November 2018 the Commission appointed 23 experts to an Expert Group on Business-to-Government Data Sharing. The conclusions and recommendations of the Expert Group will be published in a report, expected to be released by the end of 2019; the report will be used as input for possible future Commission initiatives on B2G data sharing (https://ec.europa.eu/digital-single-market/en/news/meetings-expert-group-business-government-data-sharing).

References 

  • COM(2014) 442 Towards a thriving data-driven economy
  • COM(2016) 176 ICT Standardisation Priorities for the Digital Single Market
  • COM(2017) 9 final Building a European Data Economy: A Communication on Building a European Data Economy was adopted on 10 January 2017. This Communication explores the following issues: free flow of data; access and transfer in relation to machine generated data; liability and safety in the context of emerging technologies; and portability of non-personal data, interoperability and standards. Together with the Communication the Commission has launched a public consultation.
  • Decision (EU) 2015/2240 on interoperability solutions and common frameworks for European public administrations, businesses and citizens (ISA2 programme) as a means for modernising the public sector (ISA2)
  • The PSI Directive (2013/37/EU) on the re-use of public sector information (Public Sector Information Directive) was published in the Official Journal on 27 June 2013. The Directive requires PSI to be made available for reuse by default, preferably in machine-readable formats. All Member States have transposed it into national legislation.
  • COM(2011) 882 on Open data
  • COM(2011) 833 on the reuse of Commission documents
  • COM(2015)192 “A Digital single market strategy for Europe”
  • COM(2018)234 “Proposal for a Directive on the re-use of public sector information (recast)”
  • C(2018) 2375 final “Recommendation on access to and preservation of scientific information”
  • COM(2018) 232 final “Communication Towards a common European data space”
  • COM(2020) 66 final “A European strategy for data”

Requested actions

The Communication on ICT Standardisation Priorities for the Digital Single Market proposes priority actions in the domain of big data. The actions listed below reflect some of them.

Action 1: Invite CEN to support and assist the DCAT-AP standardisation process. DCAT-AP contains specifications for metadata records to meet the specific application needs of data portals in Europe, while providing semantic interoperability with other applications on the basis of reuse of established controlled vocabularies (e.g. EuroVoc [21]) and mappings to existing metadata vocabularies (e.g. SDMX, INSPIRE metadata, Dublin Core, etc.). DCAT-AP and its extensions have been developed by multi-sectorial expert groups. Experts from international standardisation organisations participated in the groups together with open data portal owners to ensure the interoperability of the resulting specification and to assist in its standardisation. These mappings have already provided a DCAT-AP extension covering geospatial datasets, called GeoDCAT-AP. That specification was developed under the coordination of the JRC team working on the implementation of the INSPIRE Directive. Another extension, describing statistical datasets and called StatDCAT-AP, was published at the end of 2016. This work has been coordinated by Eurostat and the Publications Office.

Action 2: Promote standardisation in/via the open data infrastructure, especially the European Data Portal deployed in 2015-2020 as part of the digital service infrastructure under the Connecting Europe Facility programme.

Action 3: Support standardisation activities at different levels: H2020 R&D&I activities; support for the internationalisation of standardisation, in particular for the DCAT-AP specifications developed in the ISA2 programme (see also action 2 under the eGovernment chapter) and for specifications developed under the Future Internet public-private partnership, such as FIWARE NGSI-LD and FIWARE CKAN. Standardisation can also be enhanced by using the Core Vocabularies, as well as the Core Public Service Application Profile implemented by the ISA2 programme.

Action 4: Bring the European data community together, including through the H2020 Big Data Value public-private partnership [22], to identify missing standards and design options for a big data reference architecture, taking into account existing international approaches, in particular the work in ISO/IEC JTC 1/SC 42. In general, attention should be given to the four pillars of (semantic) discovery, privacy by design, accountability for data usage (licensing), and exchange of data together with its metadata, through the use of the Asset Description Metadata Schema (for describing reusable solutions) implemented by the ISA2 programme.

Action 5: Encourage CEN to coordinate with the relevant W3C groups on standardising DCAT-AP, in particular on preventing incompatible changes and on the conditions for availability of the standard(s).

Action 6: The European Commission, together with EU-funded pilots and projects that develop technical specifications for the provision of cross-border services (e.g. from ISA², or CEF/DEP pilots such as e-SENS on eDelivery) which need to be referenced in public procurement, should liaise with SDOs to consider how to address their possible standardisation.

Activities and additional information 

Related standardisation activities 

ETSI

ETSI’s oneM2M Partnership Project has specified the oneM2M Base Ontology (oneM2M TS-0012, ETSI TS 118 112) to enable syntactic and semantic interoperability for IoT data.

ETSI TC SmartM2M is developing a set of reference ontologies, mapped onto the oneM2M Base Ontology. This work commenced with the SAREF ontology for smart appliances, and is being extended with semantic models for data associated with smart cities, industry and manufacturing, smart agriculture and the food chain, water, automotive, eHealth/ageing well and wearables.

ETSI’s ISG for cross-cutting Context Information Management (CIM) has developed the NGSI-LD API (GS CIM 004 and GS CIM 009), which builds upon the work done by OMA SpecWorks and FIWARE. NGSI-LD is an open framework for the exchange of contextual information for smart services, aligned with best practice in linked open data. Ongoing activities involve increased interoperability with oneM2M data sources and features to attest the provenance of information, as well as options for fine-grained encryption of information.
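The subscription side of the API can be sketched as follows: a client asks the broker to notify an endpoint whenever a watched attribute changes. The identifiers and receiver URL are invented; the payload shape follows ETSI GS CIM 009.

    # Hedged sketch of an NGSI-LD subscription (POST /ngsi-ld/v1/subscriptions).
    import requests

    subscription = {
        "id": "urn:ngsi-ld:Subscription:parking-watch",   # invented identifier
        "type": "Subscription",
        "entities": [{"type": "OffStreetParking"}],
        "watchedAttributes": ["availableSpotNumber"],
        "notification": {
            "endpoint": {"uri": "http://example.org/notify",  # invented receiver
                         "accept": "application/json"}
        },
        "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
    }

    requests.post("http://localhost:1026/ngsi-ld/v1/subscriptions",
                  json=subscription,
                  headers={"Content-Type": "application/ld+json"})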

ETSI’s ISG MEC is developing a set of standardized Application Programming Interfaces (APIs) for Multi-Access Edge Computing (MEC). MEC technology offers IT service and Cloud computing capabilities at the edge of the network. Shifting processing power away from remote data centres and closer to the end user, it enables an environment that is characterized by proximity and ultra-low latency, and provides exposure to real-time network and context information.

ETSI’s TC ATTM committee has specified a set of KPIs for energy management for data centres (ETSI ES 205 200-2-1). These have been combined into a single global KPI for data centres, called DCEM, by ETSI’s ISG on Operational energy Efficiency for Users (OEU), in ETSI GS OEU 001.

ITU-T

Within ITU-T SG 13, Recommendation ITU-T Y.3600 was approved in 2015. This Recommendation provides requirements, capabilities and use cases of cloud-computing-based big data, together with the system context. Cloud-computing-based big data provides the capability to collect, store, analyse, visualise and manage varieties of large-volume datasets that cannot be rapidly transferred and analysed using traditional technologies.
More information: http://itu.int/ITU-T/workprog/wp_item.aspx?isn=9853

SG13 published a roadmap for big data standardisation in ITU-T as the Y.3600-series Supplement 40, “Big Data Standardisation Roadmap”, which covers the standardisation landscape and the identification and prioritisation of technical areas and possible standardisation activities. Additionally, work is progressing on a big data exchange framework and requirements, requirements for data provenance, a big data metadata framework and conceptual model, requirements for data integration, data storage federation and data preservation, the functional architecture of big data and BDaaS, and some aspects of big data-driven networking, such as requirements for big data-driven mobile network traffic management and planning and the application of DPI technology.
Work Programme of SG13: http://itu.int/itu-t/workprog/wp_search.aspx?sg=13

More info: https://www.itu.int/en/ITU-T/studygroups/2017-2020/13

ITU-T SG20 “Internet of things (IoT) and smart cities & communities (SC&C)” is studying big data aspects of IoT and SC&C. For example, Recommendation ITU-T Y.4114 “Specific requirements and capabilities of the IoT for big data” complements the developments on common requirements of the IoT described in Recommendation ITU-T Y.4100/Y.2066 and the functional framework and capabilities of the IoT described in Recommendation ITU-T Y.4401/Y.2068 in terms of the specific requirements and capabilities that the IoT is expected to support in order to address the challenges related to big data. This Recommendation also constitutes a basis for further standardisation work such as functional entities, application programming interfaces (APIs) and protocols concerning big data in the IoT.
Work programme of SG20 is available at: https://www.itu.int/ITU-T/workprog/wp_search.aspx?sg=20

More info: https://itu.int/go/tsg20

The ITU-T Focus Group on Data Processing and Management (FG-DPM) to support IoT and Smart Cities & Communities was set up in 2017. The Focus Group provided a platform to share views, to develop a series of deliverables and to showcase initiatives, projects and standards activities linked to data processing and management and to the establishment of IoT ecosystem solutions for data-focused cities. It concluded its work in July 2019 with the development of ten technical specifications and five technical reports. The complete list of deliverables is available at https://itu.int/en/ITU-T/focusgroups/dpm

ITU-T SG 17 has approved several standards on big data and open data, including “Security requirements and framework for big data analytics in mobile internet services” (Recommendation ITU-T X.1147) and “Data security requirements for the monitoring service of cloud computing” (Recommendation ITU-T X.1603).

SG17 is working on “Security guidelines for Big Data as a Service and security guidelines of lifecycle management for telecom Big Data” (X.sgtBD), along with standards relating to “Security guidelines for big data infrastructure and platform” (X.sgBDIP) and “Security-related misbehaviour detection mechanism based on big data analysis for connected vehicles” (X.mdcv).

More info: https://www.itu.int/en/ITU-T/studygroups/2017-2020/17

W3C

DCAT vocabulary (developed in the W3C Government Linked Data Working Group)

http://www.w3.org/TR/vocab-dcat/

After a successful Workshop on Smart Descriptions & Smarter Vocabularies (SDSVoc) (https://www.w3.org/2016/11/sdsvoc/), W3C created the Dataset Exchange Working Group (https://www.w3.org/2017/dxwg) to revise DCAT, provide a test suite for content negotiation by application profile, and develop additional relevant vocabularies in response to community demand.
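The content-negotiation deliverable can be illustrated with a sketch: the DXWG drafts propose an Accept-Profile request header through which a client asks for a dataset description conforming to a named application profile. The URLs below are placeholders and server support varies; this shows the proposed mechanism, not settled practice.

    # Hedged sketch of content negotiation by application profile (W3C DXWG draft).
    import requests

    r = requests.get(
        "http://example.org/dataset/air-quality",            # invented record URL
        headers={
            "Accept": "text/turtle",
            # Ask for a description conforming to a given profile (URI invented).
            "Accept-Profile": "<http://example.org/profiles/dcat-ap>",
        },
    )
    # A conforming server would echo the chosen profile in Content-Profile.
    print(r.headers.get("Content-Profile"), r.text[:200])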

Work on licence expression in ODRL continues and has reached a very mature state: https://www.w3.org/TR/odrl-model/ and https://www.w3.org/TR/vocab-odrl/
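To make the idea of a machine-readable licence concrete, here is a minimal policy under the ODRL information model, expressed as JSON-LD. The policy and target URIs are invented, and a real licence would carry more permissions, prohibitions and duties.

    # Hedged sketch of an ODRL 2.2 policy: permit use, require attribution.
    import json

    policy = {
        "@context": "http://www.w3.org/ns/odrl.jsonld",
        "@type": "Set",
        "uid": "http://example.org/policy/open-licence-1",    # invented id
        "permission": [{
            "target": "http://example.org/dataset/air-quality",
            "action": "use",
            "duty": [{"action": "attribute"}],                # attribution duty
        }],
    }
    print(json.dumps(policy, indent=2))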

The Data on the Web Best Practices WG has successfully finished its work (https://www.w3.org/TR/dwbp), also issuing data quality and data usage vocabularies (https://www.w3.org/TR/vocab-dqv and https://www.w3.org/TR/vocab-duv)

OASIS

The OASIS OData TC addresses querying and sharing of data across disparate applications and multiple stakeholders for reuse in enterprise, cloud and mobile devices. Specification development in the TC builds on the core OData Protocol V4 released in 2014 and addresses additional requirements identified as extensions in four directional white papers: data aggregation, temporal data, JSON documents, and XML documents as streams. OData 4.0 and OData JSON 4.0 have been approved as ISO/IEC 20802-1:2016 and ISO/IEC 20802-2:2016.

https://www.oasis-open.org/committees/odata
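For illustration, the sketch below issues an OData V4 query using the standard system query options ($filter, $select, $top); the service root and entity set are hypothetical.

    # Hedged sketch of OData V4 URL conventions against an invented service.
    import requests

    SERVICE = "https://example.org/odata/v4"   # placeholder service root

    r = requests.get(f"{SERVICE}/Datasets",
                     params={"$filter": "Category eq 'geospatial'",
                             "$select": "Id,Title",
                             "$top": "10"},
                     headers={"Accept": "application/json"})
    for item in r.json().get("value", []):     # OData wraps collections in "value"
        print(item["Id"], item["Title"])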

OpenDocument Format (ODF) is an open, standardised format for reports, office documents and free-form information, fully integrated with other XML systems, and increasingly used as a standard format for publicly-released government information. ODF was originally approved as ISO/IEC 26300:2006, and ODF v1.2 has been approved as ISO/IEC 26300:2015. Link:

https://www.oasis-open.org/committees/office

OASIS XML Localisation Interchange File Format (XLIFF): XLIFF is an XML-based format created to standardize the way in which localizable text, metadata and instructions are passed between tools and services during a localization process.

https://www.oasis-open.org/committees/xliff
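A minimal example of the format, assuming the XLIFF 2.0 core namespace: one file with a single translation unit, read back with Python's standard XML tooling.

    # Hedged sketch: build and read a tiny XLIFF 2.0 document.
    import xml.etree.ElementTree as ET

    XLIFF = """<xliff xmlns="urn:oasis:names:tc:xliff:document:2.0"
           version="2.0" srcLang="en" trgLang="de">
      <file id="f1">
        <unit id="u1">
          <segment>
            <source>Open data portal</source>
            <target>Offenes Datenportal</target>
          </segment>
        </unit>
      </file>
    </xliff>"""

    ns = {"x": "urn:oasis:names:tc:xliff:document:2.0"}
    root = ET.fromstring(XLIFF)
    for seg in root.iterfind(".//x:segment", ns):
        print(seg.findtext("x:source", namespaces=ns),
              "->", seg.findtext("x:target", namespaces=ns))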
ISO/IEC JTC1

JTC 1/SC 42 was formed in 2018 and contains WG 2, which is responsible for the big data work programme.

SC 42 has published a number of big data standards and is progressing the following big data projects, which are expected to be completed in the next year:

  • ISO/IEC 20547-1: Information technology — Big Data reference architecture — Part 1: Framework and application process
  • ISO/IEC 20547-3: Information technology — Big Data reference architecture — Part 3: Reference architecture
  • ISO/IEC 24688: Information technology — Artificial Intelligence — Process management framework for Big data analytics
IEEE

IEEE has a series of standards projects related to big data (mobile health, energy-efficient processing, personal agency and privacy) as well as pre-standardisation activities on big data and open data: https://ieeesa.io/rp-open-big-data

CEN CENELEC

CEN/WS (Workshop) ISAEN “Unique Identifier for Personal Data Usage Control in Big Data” seeks to operationalise the burgeoning policy initiatives related to big data, in particular in relation to personal data management and the protection of individuals’ fundamental rights. It is set against the backdrop of the rapidly expanding digital era of big data. The unique identifier that will be described in the resulting CWA will serve as a measurement tool to empower individuals, help them take control of their data, and make their fundamental right to privacy more actionable.

CEN/WS (Workshop) BDA: this workshop has developed a CWA that aims at defining technical requirements that will enable innovation in the aquaculture sector, turning the available local and heterogeneous large volumes of data into a universally understandable open repository of data assets. These requirements are the result of the research project AQUASMART, whose main objective is to enhance the innovation capacity of the aquaculture sector by addressing the problem of global knowledge access and data exchange between aquaculture companies and their related stakeholders. CWA 17239 ‘Big Data in Aquaculture’ was published in January 2018.

Moreover, CEN and CENELEC are cooperating with the Big Data Value Association (BDVA), the private counterpart of the European Commission in implementing the Big Data Value PPP programme (BDV PPP).

OGC

The Open Geospatial Consortium (OGC) defines and maintains standards for location-based, spatio-temporal data and services. The work includes, for instance, schemas allowing the description of spatio-temporal sensor, image, simulation and statistics data (such as “datacubes”), and a modular suite of standards for web services allowing ingestion, extraction, fusion and (with the Web Coverage Processing Service (WCPS) component standard) analytics of massive spatio-temporal data such as satellite and climate archives. OGC also contributes to INSPIRE. http://www.opengeospatial.org
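As a rough sketch of server-side “datacube” processing, the query below uses the WCPS language to subset a coverage along its time axis and encode the result. The endpoint, coverage name and axis label are invented, and request details vary between server implementations.

    # Hedged sketch of a WCPS request via a WCS 2.0 ProcessCoverages operation.
    import requests

    query = """
    for $c in (AverageTemperature)
    return encode($c[ansi("2019-01":"2019-12")], "csv")
    """

    r = requests.get("https://example.org/ows",   # invented OGC endpoint
                     params={"service": "WCS", "version": "2.0.1",
                             "request": "ProcessCoverages", "query": query})
    print(r.text[:200])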

Other activities related to standardisation

ISA and ISA2 programme of the European Commission

The DCAT application profile (DCAT-AP) has been defined. DCAT-AP is a specification based on DCAT (an RDF vocabulary designed to facilitate interoperability between data catalogues published on the web) to enable interoperability between data portals, for example to allow metasearches in the European Data Portal, which harvests data from national open data portals.

Extensions of the DCAT-AP to spatial (GeoDCAT-AP: https://joinup.ec.europa.eu/node/139283) and statistical information (StatDCAT-AP: https://joinup.ec.europa.eu/asset/stat_dcat_application_profile/home) have also been developed.

https://joinup.ec.europa.eu/asset/dcat_application_profile/description
https://joinup.ec.europa.eu/asset/dcat_application_profile/asset_release/dcat-ap-v11
CEF

Under the framework of the Connecting Europe Facility programme, support for the interoperability of metadata and data at national and EU level is being developed through dedicated calls for proposals. The CEF group is also promoting training and webinars on using the Context Broker, in collaboration as appropriate with ETSI ISG CIM, the group responsible for the NGSI-LD standards.

AquaSmart

AquaSmart enables aquaculture companies to perform data mining at the local level and get actionable results.

The project contributes to standardisation of open data in aquaculture. Results are exploited through the Aquaknowhow business portal. www.aquaknowhow.com

AutoMat

The main objective of the AutoMat project is to establish a novel and open ecosystem in the form of a cross-border Vehicle Big Data Marketplace that leverages currently unused information gathered from a large number of vehicles of various brands.

This project has contributed to standardisation of brand-independent vehicle data. www.automat-project.eu

BodyPass

BodyPass aims to break barriers between the health sector and the consumer goods sector and to eliminate the current data silos.

The main objective of BodyPass is to foster exchange, linking and re-use, as well as to integrate 3D data assets from the two sectors. For this, BodyPass adapts and creates tools that allow a secure exchange of information between data owners, companies and subjects (patients and customers).

The project aims at standardising 3D data.

www.bodypass.eu

ODINE

The Open Data Incubator for Europe (ODINE) is a 6-month incubator for open data entrepreneurs across Europe. The action is funded with a €7.8m grant from the EU’s Horizon 2020 programme.

Some of the supported SMEs and projects provided contributions to data standardisation. www.opendataincubator.eu 

EU Commission

A smart open data project by DG ENV led directly to the establishment of the Spatial Data on the Web Working Group, a collaboration between W3C and the OGC.

G8 Open Data Charter

In 2013, the EU endorsed the G8 Open Data Charter and, with other G8 members, committed to implementing a number of open data activities in the G8 members’ collective action plan (publication of core and high-quality datasets held at EU level, publication of data on the EU open data portal and the sharing of experiences of open data work).

Future Internet Public Private Partnership programme

Specifications developed under the Future Internet public-private-partnership programme (FP7):

FIWARE NGSI extends the OMA Specworks NGSI API for context information management that provides a lightweight and simple means to gather, publish, query and subscribe to context information. FIWARE NGSI can be used for real-time open data management. ETSI’s ISG for cross-cutting Context Information Management (CIM) has developed the NGSI-LD API (GS CIM 004 and GS CIM 009) which builds upon the work done by OMA Specworks and FIWARE. The latest FIWARE software implements the newest ETSI NGSI-LD specification.

FIWARE CKAN: Open Data publication Generic Enabler. FIWARE CKAN is an open-source solution for the publication, management and consumption of open data, usually, but not only, through static datasets. FIWARE CKAN allows its users to catalogue, upload and manage open datasets and data sources. It supports searching, browsing, visualising and accessing open data.
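For illustration, a dataset search against a CKAN-based portal can be sketched with CKAN's Action API (package_search is part of CKAN core; the portal URL is invented).

    # Hedged sketch: full-text dataset search on a hypothetical CKAN portal.
    import requests

    r = requests.get("https://opendata.example.org/api/3/action/package_search",
                     params={"q": "air quality", "rows": 5})
    body = r.json()
    if body.get("success"):                   # CKAN wraps results in "result"
        for pkg in body["result"]["results"]:
            print(pkg["name"], "-", pkg.get("title"))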

Big Data Value cPPP TF6 SG6 on big data standardisation

In the Big Data Value contractual public-private partnership (cPPP), a dedicated subgroup (SG6) of Task Force 6 (“Technical”) deals with big data standardisation.

Additional information

Existing standards should be checked to ensure that they take account of the protection of individuals with regard to the processing of personal data and the free movement of such data, in the light of data protection principles. Specific privacy-by-design standards should be identified and, where necessary, developed.

The report identified several priorities: 

  • Data access, including open data and governance of data within companies (enhanced exploitation, data quality, security): mix the requirements of big data into the existing management standards. The development of a standard regarding data management could be considered.
  • Data transformation, where several elements are identified:
    • criteria and methods for characterising sources and information, in terms of perceived quality and trust in a specific context;
    • indexing of unstructured data coming from social networks and data associated with mobility and sensors;
    • processes and methods of reversibility in pseudonymisation algorithms, evaluation of system performance (e.g. Hadoop), NoSQL query languages, and visualisation and manipulation processes for big data results;
    • adapting infrastructures to big data, such as cloud computing for storage and massively parallel architectures;
    • data quality and data identification.

Identifying the use cases for big data is essential. Highly visible issues for end users should be addressed: technical interoperability, SLAs, traceability of processing, data erasure, regulatory compliance, data representation, APIs, etc.

[18] see http://ec.europa.eu/information_society/policy/psi/docs/pdfs/report/final_version_study_psi.docx for an overview and http://ec.europa.eu/information_society/policy/psi/docs/pdfs/opendata2012/open_data_communication/en.pdf 

[19] see https://ec.europa.eu/isa2/solutions/dcat-application-profile-data-portals-europe_en ; https://www.fiware.org/

[20] http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.C_.2014.240.01.0001.01.ENG

[21] http://eurovoc.europa.eu/drupal/

[22] http://www.bdva.eu/

Original url: https://interoperable-europe.ec.europa.eu/collection/rolling-plan-ict-standardisation/public-sector-information-open-data-and-big-data-rp2020
