The Data Ecosystem includes a collection of data access services, exposed via APIs, that give developers direct access to the data in the ecosystem for retrieval of both data and the contextual information related to that data.
- API Gateway
- API Specifications
APIs are secure
We have an obligation to ensure that we protect the intellectual property and the investment of our clients. To ensure that our products comply with accepted secure computing principles, we observe secure development practices and ensure that security is built-in. To ensure the business interests of API providers, APIs are access-controlled.
APIs are open
We cannot know in advance all the ways in which APIs will be used to create value for users. Therefore, we take the approach of "openness by default". All APIs are public unless there is a compelling business reason to do otherwise.
APIs are consistent
APIs are managed so that API users can be confident in reliable, consistent results over time. APIs must be stable and versioned, and must adhere to a published, well-known life cycle and deprecation strategy.
APIs should be easy to use
Cloud-based APIs comprise the platform for new digital services and workflows. To optimize for rapid, hypothesis-driven development, and to gain the biggest benefit from platform integration and collaboration, APIs must minimize barriers to use. APIs must be organized and easy to discover. APIs must follow standards (return codes, HTTP methods, REST, OpenAPI). APIs are appropriately documented and behave as expected, and they can be tested using automated testing techniques.
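As a brief sketch of these conventions (the service name and version segment below are hypothetical examples, not part of any published specification), a client can rely on standard HTTP status-code semantics and stable, versioned REST paths:

```python
# Sketch: standard HTTP status-code semantics and a versioned REST path.
# The service name and version segment used here are hypothetical.

# Conventional meanings of common HTTP status codes (per the HTTP standard).
STATUS_MEANINGS = {
    200: "OK - request succeeded",
    201: "Created - resource persisted",
    400: "Bad Request - client sent invalid input",
    401: "Unauthorized - missing or invalid credentials",
    404: "Not Found - no such resource",
    500: "Internal Server Error - retry or report",
}

def versioned_path(service: str, version: int, resource: str) -> str:
    """Build a stable, versioned REST path, e.g. /api/storage/v2/records."""
    return f"/api/{service}/v{version}/{resource}"

print(versioned_path("storage", 2, "records"))  # → /api/storage/v2/records
```

Because the version lives in the path, a breaking change ships as a new version while existing clients keep calling the old one, which is what makes the deprecation strategy above workable.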
API Governance facilitates
API Governance seeks to ensure the benefits of openness. Compliance activities are lightweight, developer-friendly and automated wherever possible. There is a well-known process for resolving disputes. API Governance helps ensure good feedback from, and participation in, the review process.
We have adopted an API gateway (Apigee) to:
- Enable self-service for developers and partners
- Provide the ability to create sensible API names without rewriting services
- Provide an entry point for security
- Provide the ability to abstract the cloud provider (works with Google, Azure, and Amazon)
- Provide the ability to manage rollout, scaling, versioning and rollback
- Provide the ability to monitor API calls (performance, traffic, errors, latency)
Go to the Developer Portal to see the latest API documentation. Alternatively, you can find local documentation on the Wiki at API Specifications.
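A minimal sketch of what a call routed through the gateway looks like, assuming a hypothetical gateway host, API path, and access token (the real values come from the Developer Portal and the gateway's auth flow). The request is constructed but not sent:

```python
import urllib.request

# Sketch of a request routed through the API gateway. The gateway host,
# API path, and token below are hypothetical placeholders.
GATEWAY_HOST = "https://api.example.com"   # assumed gateway base URL
ACCESS_TOKEN = "<access-token>"            # obtained via the gateway's auth flow

def build_request(path: str) -> urllib.request.Request:
    """Construct (but do not send) an authenticated request to the gateway."""
    return urllib.request.Request(
        url=f"{GATEWAY_HOST}{path}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/json",
        },
        method="GET",
    )

req = build_request("/api/search/v2/query")
print(req.full_url)      # https://api.example.com/api/search/v2/query
print(req.get_method())  # GET
```

The client only ever sees the gateway hostname and the sensible API name; which cloud provider and backend service actually answer the call is the gateway's concern, which is what enables rollout, scaling, and rollback behind a stable entry point.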
Data Ecosystem Core services include:
- Entitlements Service is used to enable API and data authorization in the Data Ecosystem as well as for user management.
- Compliance Service is used to ensure that data is handled in a legally compliant manner.
- Storage Service provides a set of APIs to manage the entire metadata life-cycle such as ingestion (persistence), modification, deletion, versioning and data schema. Storage Service is used to ingest metadata information generated by DELFI applications into the Data Ecosystem.
- Search Service provides a model for indexing documents that contain structured data.
- The Data Reference Services include the CRS Catalog Service, the CRS Conversion Service, and the Unit Service. The CRS Catalog Service enables users to download the entire Coordinate Reference Systems (CRS) catalog for local caching, to access various subsets of the catalog, and to search for a CRS. The CRS Conversion Service provides spatial reference conversions for coordinates. The Unit Service provides dimension/measurement and unit definitions; given two unit definitions, it also offers conversion parameters in two different parameterizations.
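To illustrate how a client might apply the conversion parameters returned by the Unit Service, here is a sketch assuming one of the parameterizations is the fractional form y = (A + B·x) / (C + D·x) common in unit catalogs; whether the Unit Service returns exactly this form, and the feet-to-metres coefficients used below, are assumptions for illustration:

```python
# Sketch: applying unit-conversion parameters of the assumed form
#   y = (A + B * x) / (C + D * x)

def convert(x: float, a: float, b: float, c: float, d: float) -> float:
    """Convert a value x using the fractional ABCD parameterization."""
    return (a + b * x) / (c + d * x)

# Example: international feet to metres (A=0, B=0.3048, C=1, D=0).
print(convert(10.0, 0.0, 0.3048, 1.0, 0.0))  # → 3.048
```

A purely linear conversion (such as feet to metres) sets D = 0, so the general fractional form degrades gracefully to y = A + B·x when the catalog only needs a scale and offset.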