
KAPSARC Data Portal

Connector Details

Type

Virtual machines, Single VM, BYOL

Runs on

Google Compute Engine

Last Update

24 October, 2024

Category

Business Use

Overview

The KAPSARC Data Portal API enables seamless integration with the KAPSARC Data Portal, providing access to energy economics and related datasets for applications such as data analytics, research tools, and visualization platforms. The API supports dataset enumeration, record queries, data exports, and facet-based navigation across datasets such as oil databases, consumer price indices, and natural gas data. It is organized around REST, supports only HTTP GET methods, returns JSON responses, and uses the Opendatasoft Query Language (ODSQL) for querying.

Integration Overview

This document outlines each integration point, its purpose, configuration, and supported workflows using the KAPSARC Data Portal API. Free registration is required for full access, including advanced filters, downloadable data, and API authentication via an API key.
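
As a quick illustration of the request pattern, here is a minimal sketch in Python. It assumes the portal exposes the standard Opendatasoft Explore v2.1 REST paths and accepts the API key as an apikey query parameter; the base URL and key shown are placeholders to adjust for your environment.

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed portal host and API version
  API_KEY = "YOUR_API_KEY"  # issued after free registration

  # Every operation is an HTTP GET returning JSON; ODSQL expressions go in query parameters.
  response = requests.get(
      f"{BASE_URL}/catalog/datasets",
      params={"limit": 5, "apikey": API_KEY},
      timeout=30,
  )
  response.raise_for_status()
  print(response.json()["total_count"])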

Supported Integration Action Points

  • Catalog Datasets Retrieval: Retrieves a list of all datasets in the catalog.
  • Catalog Exports Listing: Lists available export formats for the dataset catalog.
  • Catalog Export by Format: Exports the dataset catalog in a specified format (e.g., CSV, JSON).
  • Catalog CSV Export: Exports the dataset catalog as CSV with customizable parameters.
  • Catalog DCAT Export: Exports catalog metadata in DCAT-AP format (RDF/XML).
  • Catalog Facets Retrieval: Retrieves global catalog-level facets for navigation.
  • Dataset Metadata Retrieval: Retrieves metadata for a specific dataset.
  • Dataset Records Retrieval: Retrieves records for a specific dataset.
  • Dataset Record Retrieval: Retrieves a specific record from a dataset.
  • Dataset Exports Listing: Lists supported export formats for a dataset.
  • Dataset Export by Format: Exports a dataset in a specific format (e.g., CSV, JSON).
  • Dataset CSV Export: Exports a dataset in CSV format with specific parameters.
  • Dataset Parquet Export: Exports a dataset in Parquet format.
  • Dataset GPX Export: Exports a dataset in GPX format for geographic data.
  • Dataset Facets Retrieval: Retrieves available facets for a dataset.
  • Dataset Attachments Retrieval: Retrieves files or attachments related to a dataset.

Detailed Integration Documentation

Catalog Datasets Retrieval

Action

getDatasets

Purpose

Retrieves a comprehensive list of all datasets in the catalog, serving as the entry point for exploring data like Saudi Arabia Oil Database or Consumer Price Index datasets.

Parameters

  • Required: None.
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, title).
    • where: Filter datasets using ODSQL (e.g., publisher="KAPSARC").
    • order_by: Sort results (e.g., modified desc).
    • limit: Number of items (default: 10, max: 100).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., theme:Energy).
    • exclude: Exclude facet values (e.g., theme:Energy).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • group_by: Group results (e.g., publisher).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Requires registration and API key setup for authentication. Ensure environment variables are configured for connectivity.

Output

  • Successful: Returns a JSON object with total_count, _links, and results (array of dataset objects with dataset_id, title, metas, etc.).
  • Failure: Returns error details (e.g., unauthorized access, invalid ODSQL query).

Workflow Example

1. Configure the API with an API key.
2. Execute the getDatasets operation to fetch datasets.
3. Process the response to identify datasets (e.g., saudi-arabia-oil-database) for further actions.
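
A minimal sketch of this workflow, assuming the catalog endpoint /catalog/datasets and the placeholder base URL and API key from the earlier example:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  params = {
      "select": "dataset_id",
      "where": 'publisher="KAPSARC"',   # ODSQL filter
      "order_by": "modified desc",
      "limit": 20,
      "apikey": "YOUR_API_KEY",
  }
  catalog = requests.get(f"{BASE_URL}/catalog/datasets", params=params, timeout=30)
  catalog.raise_for_status()
  for dataset in catalog.json().get("results", []):
      print(dataset.get("dataset_id"))  # e.g. saudi-arabia-oil-database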

Catalog Exports Listing

Operation ID

listExportFormats

Purpose

Lists available export formats for the dataset catalog, enabling users to select formats like CSV or JSON.

Parameters

  • Required: None.
  • Optional: None.

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with a links array listing export formats (e.g., csv, json).
  • Failure: Returns error details (e.g., authentication error).

Workflow Example

1. Execute the listExportFormats operation to retrieve export formats.
2. Review formats (e.g., csv) to proceed with catalog export.
3. Use the exportDatasets operation for export.
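
A short sketch of steps 1-2, assuming the export formats are listed at /catalog/exports and returned in a links array:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  formats = requests.get(f"{BASE_URL}/catalog/exports", params={"apikey": "YOUR_API_KEY"}, timeout=30).json()
  # Each entry is expected to carry the format name in its "rel" field (e.g. csv, json).
  print([link.get("rel") for link in formats.get("links", [])])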

Catalog Export by Format

Operation ID

exportDatasets

Purpose

Exports the entire dataset catalog in a specified format (e.g., JSON, CSV) for research or analysis.

Parameters

  • Required: format (e.g., csv, json, xlsx, rdf).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, title).
    • where: Filter datasets using ODSQL (e.g., publisher="KAPSARC").
    • order_by: Sort results (e.g., modified desc).
    • group_by: Group results (e.g., publisher).
    • limit: Number of items (default: -1, retrieves all).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., theme:Energy).
    • exclude: Exclude facet values (e.g., theme:Energy).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a file in the specified format.
  • Failure: Returns error details (e.g., unsupported format).

Workflow Example

1. Use the listExportFormats operation to identify supported formats.
2. Execute the exportDatasets operation with format=csv and limit=10.
3. Save the CSV file for analysis.
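
A sketch of steps 2-3, assuming the format-specific export path /catalog/exports/{format}:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(
      f"{BASE_URL}/catalog/exports/csv",
      params={"limit": 10, "apikey": "YOUR_API_KEY"},
      timeout=60,
  )
  response.raise_for_status()
  with open("catalog.csv", "wb") as f:
      f.write(response.content)  # save the exported catalog for analysis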

Catalog CSV Export

Operation ID

exportCatalogCSV

Purpose

Exports the dataset catalog in CSV format with customizable parameters for tailored data extraction.

Parameters

  • Required: None.
  • Optional:
    • delimiter: Field delimiter character (e.g., a comma; default: ;).
    • list_separator: Multivalued string separator (default: ,).
    • quote_all: Quote all strings (boolean, default: false).
    • with_bom: Include Unicode BOM (boolean, default: true).
    • select: Specify fields to include (e.g., dataset_id, title).
    • where: Filter datasets using ODSQL (e.g., publisher="KAPSARC").
    • order_by: Sort results (e.g., modified desc).
    • group_by: Group results (e.g., publisher).
    • limit: Number of items (default: -1, retrieves all).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., theme:Energy).
    • exclude: Exclude facet values (e.g., theme:Energy).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a CSV file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportCatalogCSV operation with delimiter=,.
2. Save the CSV file containing catalog metadata.
3. Use the file for research or reporting.
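
A sketch of this workflow with the comma delimiter, under the same path and credential assumptions as the earlier examples:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(
      f"{BASE_URL}/catalog/exports/csv",
      params={"delimiter": ",", "quote_all": "true", "apikey": "YOUR_API_KEY"},
      timeout=60,
  )
  with open("catalog_metadata.csv", "wb") as f:
      f.write(response.content)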

Catalog DCAT Export

Operation ID

exportCatalogDCAT

Purpose

Exports catalog metadata in RDF/XML (DCAT-AP) format for integration with metadata systems.

Parameters

  • Required: dcat_ap_format (e.g., _ap_ch, _ap_de).
  • Optional:
    • include_exports: Export formats to include (e.g., csv,json).
    • use_labels_in_exports: Use field labels (boolean, default: true).

Configuration

Requires API key authentication.

Output

  • Successful: Returns an RDF/XML file.
  • Failure: Returns error details (e.g., invalid DCAT format).

Workflow Example

1. Execute the exportCatalogDCAT operation with dcat_ap_format=_ap_ch.
2. Save the RDF/XML file.
3. Integrate with DCAT-compatible systems.
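
A sketch of the DCAT export; the exact path layout is an assumption (here the dcat_ap_format value is appended to /catalog/exports/dcat):

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dcat_ap_format = "_ap_ch"
  response = requests.get(
      f"{BASE_URL}/catalog/exports/dcat{dcat_ap_format}",  # assumed path layout
      params={"apikey": "YOUR_API_KEY"},
      timeout=60,
  )
  with open("catalog_dcat.rdf", "wb") as f:
      f.write(response.content)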

Catalog Facets Retrieval

Operation ID

getDatasetsFacets

Purpose

Retrieves facet values for datasets (e.g., theme, publisher) to aid navigation and filtering.

Parameters

  • Required: None.
  • Optional:
    • facet: Specify facet to retrieve (e.g., theme).
    • where: Filter datasets using ODSQL (e.g., publisher="KAPSARC").
    • refine: Filter by facet (e.g., theme:Energy).
    • exclude: Exclude facet values (e.g., theme:Energy).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with links and facets arrays.
  • Failure: Returns error details (e.g., invalid facet).

Workflow Example

1. Execute the getDatasetsFacets operation with facet=theme.
2. Use facet values (e.g., Energy) to refine dataset queries.
3. Filter results in subsequent operations.
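
A sketch of steps 1-2, assuming catalog facets are served at /catalog/facets:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(f"{BASE_URL}/catalog/facets", params={"facet": "theme", "apikey": "YOUR_API_KEY"}, timeout=30)
  # The facets array is expected to hold one entry per facet name with its possible values.
  for facet in response.json().get("facets", []):
      print(facet.get("name"), [value.get("name") for value in facet.get("facets", [])])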

Dataset Metadata Retrieval

Operation ID

getDataset

Purpose

Retrieves detailed metadata for a specific dataset (e.g., fields, endpoints) to plan further queries or exports.

Parameters

  • Required: dataset_id (e.g., saudi-arabia-oil-database).
  • Optional:
    • select: Specify fields to include (e.g., dataset_id, title).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with dataset details (e.g., dataset_id, title, metas, fields).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDatasets operation to identify a dataset (e.g., saudi-arabia-oil-database).
2. Execute the getDataset operation with dataset_id.
3. Review metadata for further actions.
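
A sketch of steps 2-3, assuming the dataset metadata path /catalog/datasets/{dataset_id}:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dataset_id = "saudi-arabia-oil-database"  # identifier taken from the document's example
  metadata = requests.get(
      f"{BASE_URL}/catalog/datasets/{dataset_id}",
      params={"apikey": "YOUR_API_KEY"},
      timeout=30,
  ).json()
  # Field definitions drive the select/where clauses used in later record queries.
  print([field.get("name") for field in metadata.get("fields", [])])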

Dataset Records Retrieval

Operation ID

getRecords

Purpose

Queries records from a specific dataset for detailed data analysis (e.g., oil production data).

Parameters

  • Required: dataset_id.
  • Optional:
    • select: Specify fields to include (e.g., year, production).
    • where: Filter records using ODSQL (e.g., year=2020).
    • group_by: Group results (e.g., region).
    • order_by: Sort results (e.g., production desc).
    • limit: Number of items (default: 10, max: 100 without group_by, 20000 with group_by).
    • offset: Starting index (default: 0).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • include_links: Include HATEOAS links (boolean, default: false).
    • include_app_metas: Include metadata (boolean, default: false).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with total_count, _links, and results (array of records).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset operation to select a dataset (e.g., saudi-arabia-oil-database).
2. Execute the getRecords operation with dataset_id and a where filter such as year=2020.
3. Process the response for analysis or visualization.
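
A sketch of steps 2-3, assuming the records path /catalog/datasets/{dataset_id}/records; the year and production field names follow the document's examples:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dataset_id = "saudi-arabia-oil-database"
  params = {
      "select": "year,production",   # example fields from this document
      "where": "year=2020",          # ODSQL filter
      "order_by": "production desc",
      "limit": 100,
      "apikey": "YOUR_API_KEY",
  }
  records = requests.get(f"{BASE_URL}/catalog/datasets/{dataset_id}/records", params=params, timeout=30).json()
  print(records.get("total_count"))
  for record in records.get("results", []):
      print(record)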

Dataset Record Retrieval

Operation ID

getRecord

Purpose

Retrieves a single record from a dataset for detailed data inspection.

Parameters

  • Required: dataset_id, record_id (e.g., oil_001).
  • Optional:
    • select: Specify fields to include (e.g., year, production).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object representing the record.
  • Failure: Returns error details (e.g., invalid record_id).

Workflow Example

1. Use the getRecords operation to identify a record in saudi-arabia-oil-database.
2. Execute the getRecord operation with dataset_id and record_id=oil_001.
3. Review the record details for processing.
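
A sketch of steps 2-3, assuming the record path /catalog/datasets/{dataset_id}/records/{record_id} and the example identifiers from this document:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dataset_id, record_id = "saudi-arabia-oil-database", "oil_001"
  record = requests.get(
      f"{BASE_URL}/catalog/datasets/{dataset_id}/records/{record_id}",
      params={"select": "year,production", "apikey": "YOUR_API_KEY"},
      timeout=30,
  ).json()
  print(record)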

Dataset Exports Listing

Operation ID

listDatasetExportFormats

Purpose

Lists available export formats for a specific dataset, aiding format selection for export.

Parameters

  • Required: dataset_id.
  • Optional:
    • None.

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with a links array listing export formats (e.g., csv, json, parquet).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset operation to select a dataset (e.g., saudi-arabia-oil-database).
2. Execute the listDatasetExportFormats operation with dataset_id.
3. Review formats (e.g., csv, json) for export.
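
A short sketch of steps 2-3, assuming dataset-level export formats are listed at /catalog/datasets/{dataset_id}/exports:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dataset_id = "saudi-arabia-oil-database"
  formats = requests.get(
      f"{BASE_URL}/catalog/datasets/{dataset_id}/exports",
      params={"apikey": "YOUR_API_KEY"},
      timeout=30,
  ).json()
  print([link.get("rel") for link in formats.get("links", [])])  # e.g. csv, json, parquet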

Dataset Export by Format

Operation ID

exportRecords

Purpose

Exports a dataset in a specified format (e.g., CSV, JSON, Parquet) for diverse applications.

Parameters

  • Required: dataset_id, format (e.g., csv, json, parquet).
  • Optional:
    • select: Specify fields to include (e.g., year, production).
    • where: Filter records using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., production desc).
    • group_by: Group results (e.g., region).
    • limit: Number of items (default: -1, retrieves all).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • use_labels: Use field labels (boolean, default: false).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a file in the specified format.
  • Failure: Returns error details (e.g., invalid format).

Workflow Example

1. Use the listDatasetExportFormats operation to identify formats for saudi-arabia-oil-database.
2. Execute the exportRecords operation with dataset_id and format=csv.
3. Save the exported file for use.
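
A sketch of steps 2-3 that streams the export to disk, assuming the path /catalog/datasets/{dataset_id}/exports/{format}:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  dataset_id, export_format = "saudi-arabia-oil-database", "csv"
  with requests.get(
      f"{BASE_URL}/catalog/datasets/{dataset_id}/exports/{export_format}",
      params={"apikey": "YOUR_API_KEY"},
      stream=True,
      timeout=120,
  ) as response:
      response.raise_for_status()
      with open(f"{dataset_id}.{export_format}", "wb") as f:
          for chunk in response.iter_content(chunk_size=8192):
              f.write(chunk)  # stream large exports instead of buffering them in memory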

Dataset CSV Export

Operation ID

exportRecordsCSV

Purpose

Exports a dataset in CSV format with customizable parameters for tailored data extraction.

Parameters

  • Required: dataset_id.
  • Optional:
    • delimiter: Field delimiter character (e.g., a comma; default: ;).
    • list_separator: Multivalued string separator (default: ,).
    • quote_all: Quote all strings (boolean, default: false).
    • with_bom: Include Unicode BOM (boolean, default: true).
    • select: Specify fields to include (e.g., year, production).
    • where: Filter records using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., production desc).
    • group_by: Group results (e.g., region).
    • limit: Number of items (default: -1, retrieves all).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • use_labels: Use field labels (boolean, default: false).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a CSV file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportRecordsCSV operation with dataset_id=saudi-arabia-oil-database and delimiter=,.
2. Save the CSV file containing dataset records.
3. Use the file for analysis or reporting.
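
A short sketch of step 1 with the comma delimiter, under the same assumptions as the earlier examples:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(
      f"{BASE_URL}/catalog/datasets/saudi-arabia-oil-database/exports/csv",
      params={"delimiter": ",", "use_labels": "true", "apikey": "YOUR_API_KEY"},
      timeout=120,
  )
  with open("oil_records.csv", "wb") as f:
      f.write(response.content)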

Dataset Parquet Export

Operation ID

exportRecordsParquet

Purpose

Exports a dataset in Parquet format, ideal for efficient storage and processing in analytics platforms.

Parameters

  • Required: dataset_id.
  • Optional:
    • parquet_compression: Compression type (snappy or zstd, default: snappy).
    • select: Specify fields to include (e.g., year, production).
    • where: Filter records using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., production desc).
    • group_by: Group results (e.g., region).
    • limit: Number of items (default: -1, retrieves all).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • use_labels: Use field labels (boolean, default: false).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a Parquet file.
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Execute the exportRecordsParquet operation with dataset_id=saudi-arabia-oil-database.
2. Save the Parquet file.
3. Use the file for data processing in compatible systems.
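
A sketch of this workflow that also loads the file for inspection; it assumes pandas with a Parquet engine (e.g., pyarrow) is installed in addition to requests:

  import io

  import pandas as pd  # requires a Parquet engine such as pyarrow
  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(
      f"{BASE_URL}/catalog/datasets/saudi-arabia-oil-database/exports/parquet",
      params={"parquet_compression": "snappy", "apikey": "YOUR_API_KEY"},
      timeout=120,
  )
  with open("saudi-arabia-oil-database.parquet", "wb") as f:
      f.write(response.content)
  print(pd.read_parquet(io.BytesIO(response.content)).head())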

Dataset GPX Export

Operation ID

exportRecordsGPX

Purpose

Exports a dataset in GPX format, suitable for geographic data visualization and GPS applications.

Parameters

  • Required: dataset_id.
  • Optional:
    • name_field: Field for waypoint names (e.g., region).
    • description_field_list: Fields for waypoint descriptions (e.g., production).
    • use_extension: Include extensions in GPX (boolean, default: true).
    • select: Specify fields to include (e.g., year, production).
    • where: Filter records using ODSQL (e.g., year=2020).
    • order_by: Sort results (e.g., production desc).
    • group_by: Group results (e.g., region).
    • limit: Number of items (default: -1, retrieves all).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).
    • use_labels: Use field labels (boolean, default: false).
    • epsg: Coordinate system for geospatial data (e.g., 4326).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a GPX file.
  • Failure: Returns error details (e.g., invalid parameters).

Workflow Example

1. Execute the exportRecordsGPX operation with dataset_id=saudi-arabia-oil-database and name_field=region.
2. Save the GPX file.
3. Use the file for geographic data visualization.
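
A sketch of steps 1-2, assuming the GPX export path /catalog/datasets/{dataset_id}/exports/gpx and the region field from the document's example:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  response = requests.get(
      f"{BASE_URL}/catalog/datasets/saudi-arabia-oil-database/exports/gpx",
      params={"name_field": "region", "epsg": 4326, "apikey": "YOUR_API_KEY"},
      timeout=120,
  )
  with open("oil_locations.gpx", "wb") as f:
      f.write(response.content)  # load into a GPS or mapping tool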

Dataset Facets Retrieval

Operation ID

getRecordsFacets

Purpose

Retrieves facet values for records in a specific dataset, aiding in filtering and navigation.

Parameters

  • Required: dataset_id.
  • Optional:
    • facet: Specify facet to retrieve (e.g., year).
    • where: Filter records using ODSQL (e.g., year=2020).
    • refine: Filter by facet (e.g., year:2020).
    • exclude: Exclude facet values (e.g., year:2020).
    • lang: Language for formatting (e.g., en).
    • timezone: Timezone for datetime fields (e.g., Asia/Riyadh).

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with links and facets arrays.
  • Failure: Returns error details (e.g., invalid facet).

Workflow Example

1. Use the getDataset operation to select saudi-arabia-oil-database.
2. Execute the getRecordsFacets operation with dataset_id and facet=year.
3. Use facet values to refine record queries.
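
A short sketch of steps 2-3, assuming dataset facets are served at /catalog/datasets/{dataset_id}/facets:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  facets = requests.get(
      f"{BASE_URL}/catalog/datasets/saudi-arabia-oil-database/facets",
      params={"facet": "year", "apikey": "YOUR_API_KEY"},
      timeout=30,
  ).json()
  print(facets.get("facets"))  # facet values such as years, used to refine record queries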

Dataset Attachments Retrieval

Operation ID

getDatasetAttachments

Purpose

Retrieves files or attachments related to a specific dataset, providing access to supplementary resources.

Parameters

  • Required: dataset_id.
  • Optional:
    • None.

Configuration

Requires API key authentication.

Output

  • Successful: Returns a JSON object with links and attachments arrays (e.g., href, mime-type, title).
  • Failure: Returns error details (e.g., invalid dataset_id).

Workflow Example

1. Use the getDataset operation to select saudi-arabia-oil-database.
2. Execute the getDatasetAttachments operation with dataset_id.
3. Download attachments (e.g., oil_report.pdf) for further use.
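
A sketch of steps 2-3, assuming attachments are listed at /catalog/datasets/{dataset_id}/attachments with href links to the files:

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  attachments = requests.get(
      f"{BASE_URL}/catalog/datasets/saudi-arabia-oil-database/attachments",
      params={"apikey": "YOUR_API_KEY"},
      timeout=30,
  ).json()
  for attachment in attachments.get("attachments", []):
      print(attachment.get("title"), attachment.get("href"))  # download each href as needed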

Workflow Creation with the API

Example Workflow: Exploring and Exporting Oil Production Data

Retrieve Catalog Datasets:

  • Use the getDatasets operation to fetch available datasets.
  • Identify the target dataset (e.g., saudi-arabia-oil-database).

Refine Dataset Exploration:

Execute the getRecordsFacets operation with dataset_id=saudi-arabia-oil-database and facet=year to identify the years available for filtering records.

Query Dataset Records:

Use the getRecords operation with dataset_id=saudi-arabia-oil-database and a where filter such as year=2020 to fetch records.

Export Dataset Data:

  • Export in CSV using the exportRecordsCSV operation with dataset_id=saudi-arabia-oil-database.
  • Export in Parquet using the exportRecordsParquet operation for analytical processing.
  • Generate a GPX file for geographic visualization using the exportRecordsGPX operation with name_field=region.

Retrieve Metadata and Attachments:

  • Fetch metadata using the getDataset operation for saudi-arabia-oil-database.
  • Download attachments (e.g., reports) via the getDatasetAttachments operation.

Export Catalog Metadata:

  • Export the catalog as CSV using the exportCatalogCSV operation.
  • Generate DCAT-AP metadata via the exportCatalogDCAT operation with dcat_ap_format=_ap_ch.
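
The steps above can be chained into one script. The sketch below follows the same assumptions as the earlier examples (Opendatasoft Explore v2.1 paths, apikey query parameter, placeholder host and key, and the example dataset and field names from this document):

  import requests

  BASE_URL = "https://datasource.kapsarc.org/api/explore/v2.1"  # assumed host
  API_KEY = "YOUR_API_KEY"
  DATASET = "saudi-arabia-oil-database"

  def get(path, **params):
      """Authenticated GET helper returning the raw response."""
      params["apikey"] = API_KEY
      response = requests.get(f"{BASE_URL}{path}", params=params, timeout=120)
      response.raise_for_status()
      return response

  # 1. Retrieve catalog datasets and confirm the target is present.
  dataset_ids = [d.get("dataset_id") for d in get("/catalog/datasets", limit=100).json().get("results", [])]

  # 2. Explore the year facet to choose a period of interest.
  year_facets = get(f"/catalog/datasets/{DATASET}/facets", facet="year").json()

  # 3. Query the 2020 records.
  records = get(f"/catalog/datasets/{DATASET}/records", where="year=2020", limit=100).json()

  # 4. Export the dataset as CSV for downstream analysis.
  with open(f"{DATASET}.csv", "wb") as f:
      f.write(get(f"/catalog/datasets/{DATASET}/exports/csv", delimiter=",").content)

  # 5. Fetch metadata and list any attachments (e.g. reports).
  metadata = get(f"/catalog/datasets/{DATASET}").json()
  attachments = get(f"/catalog/datasets/{DATASET}/attachments").json().get("attachments", [])
  print(DATASET in dataset_ids, records.get("total_count"), len(attachments))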

Pricing

Request a Quote

Support

For technical support, please contact us at

custom-connectors-support@isolutions.sa
