Commit 44a1e5f8 authored by Cyril Monmouton


Merge branch 'master' of https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services into feature/explicit-error-not-str-columns

# Conflicts:
#	tests/unit/routers/chunking_test.py
parents 5b55fdad a157c10b
Pipeline #50124 failed in 4 minutes and 27 seconds
@@ -12,7 +12,6 @@ The following software have components provided under the terms of this license:
- boto3 (from https://github.com/boto/boto3)
- botocore (from https://github.com/boto/botocore)
- coverage (from https://coverage.readthedocs.io)
- cryptography (from https://github.com/pyca/cryptography)
- google-api-core (from https://github.com/GoogleCloudPlatform/google-cloud-python)
- google-auth (from https://github.com/GoogleCloudPlatform/google-auth-library-python)
- google-auth-oauthlib (from https://github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib)
@@ -56,14 +55,14 @@ BSD-2-Clause
========================================================================
The following software have components provided under the terms of this license:
- decorator (from https://github.com/micheles/decorator)
- grpcio (from http://www.grpc.io)
- locket (from http://github.com/mwilliamson/locket.py)
- mock (from https://github.com/testing-cabal/mock)
- numpy (from http://www.numpy.org)
- ply (from http://www.dabeaz.com/ply/)
- packaging (from https://github.com/pypa/packaging)
- pyasn1 (from http://sourceforge.net/projects/pyasn1/)
- pyasn1-modules (from http://sourceforge.net/projects/pyasn1/)
- pycparser (from https://github.com/eliben/pycparser)
- tblib (from https://github.com/ionelmc/python-tblib)
========================================================================
@@ -77,7 +76,6 @@ The following software have components provided under the terms of this license:
- click (from http://github.com/mitsuhiko/click)
- cloudpickle (from https://github.com/cloudpipe/cloudpickle)
- colorama (from https://github.com/tartley/colorama)
- cryptography (from https://github.com/pyca/cryptography)
- dask (from http://github.com/dask/dask/)
- decorator (from https://github.com/micheles/decorator)
- distributed (from https://distributed.readthedocs.io/en/latest/)
@@ -89,26 +87,20 @@ The following software have components provided under the terms of this license:
- httpx (from https://github.com/encode/httpx)
- idna (from https://github.com/kjd/idna)
- isodate (from http://cheeseshop.python.org/pypi/isodate)
- locket (from http://github.com/mwilliamson/locket.py)
- mock (from https://github.com/testing-cabal/mock)
- numpy (from http://www.numpy.org)
- oauthlib (from https://github.com/idan/oauthlib)
- packaging (from https://github.com/pypa/packaging)
- pandas (from http://pandas.pydata.org)
- partd (from http://github.com/dask/partd/)
- ply (from http://www.dabeaz.com/ply/)
- protobuf (from https://developers.google.com/protocol-buffers/)
- psutil (from https://github.com/giampaolo/psutil)
- pyarrow (from https://arrow.apache.org/)
- pyasn1 (from http://sourceforge.net/projects/pyasn1/)
- pyasn1-modules (from http://sourceforge.net/projects/pyasn1/)
- pycparser (from https://github.com/eliben/pycparser)
- pyparsing (from http://pyparsing.wikispaces.com/)
- pyrsistent (from http://github.com/tobgu/pyrsistent/)
- python-dateutil (from https://dateutil.readthedocs.org)
- python-rapidjson (from https://github.com/python-rapidjson/python-rapidjson)
- requests-oauthlib (from https://github.com/requests/requests-oauthlib)
- starlette (from https://github.com/encode/starlette)
- tblib (from https://github.com/ionelmc/python-tblib)
- toolz (from http://github.com/pytoolz/toolz/)
- uvicorn (from https://github.com/tomchristie/uvicorn)
- zict (from http://github.com/dask/zict/)
@@ -119,35 +111,33 @@ CC-BY-4.0
The following software have components provided under the terms of this license:
- adlfs (from https://github.com/hayesgb/adlfs/)
- dask (from http://github.com/dask/dask/)
- distributed (from https://distributed.readthedocs.io/en/latest/)
- fsspec (from http://github.com/intake/filesystem_spec)
- gcsfs (from https://github.com/dask/gcsfs)
- pandas (from http://pandas.pydata.org)
- partd (from http://github.com/dask/partd/)
- toolz (from http://github.com/pytoolz/toolz/)
========================================================================
CC-BY-SA-3.0
CC0-1.0
========================================================================
The following software have components provided under the terms of this license:
- dask (from http://github.com/dask/dask/)
- distributed (from https://distributed.readthedocs.io/en/latest/)
- numpy (from http://www.numpy.org)
========================================================================
CNRI-Python
DOC
========================================================================
The following software have components provided under the terms of this license:
- isodate (from http://cheeseshop.python.org/pypi/isodate)
- ply (from http://www.dabeaz.com/ply/)
- dask (from http://github.com/dask/dask/)
- distributed (from https://distributed.readthedocs.io/en/latest/)
- numpy (from http://www.numpy.org)
========================================================================
GPL-2.0-only
========================================================================
The following software have components provided under the terms of this license:
- coverage (from https://coverage.readthedocs.io)
- grpcio (from http://www.grpc.io)
========================================================================
@@ -156,22 +146,28 @@ GPL-2.0-or-later
The following software have components provided under the terms of this license:
- grpcio (from http://www.grpc.io)
- pyparsing (from http://pyparsing.wikispaces.com/)
========================================================================
GPL-3.0-only
========================================================================
The following software have components provided under the terms of this license:
- coverage (from https://coverage.readthedocs.io)
- grpcio (from http://www.grpc.io)
========================================================================
GPL-3.0-or-later
========================================================================
The following software have components provided under the terms of this license:
- pyparsing (from http://pyparsing.wikispaces.com/)
- rfc3986 (from https://rfc3986.readthedocs.org)
========================================================================
ISC
========================================================================
The following software have components provided under the terms of this license:
- click (from http://github.com/mitsuhiko/click)
- grpcio (from http://www.grpc.io)
- requests-oauthlib (from https://github.com/requests/requests-oauthlib)
@@ -196,21 +192,6 @@ The following software have components provided under the terms of this license:
- chardet (from https://github.com/chardet/chardet)
========================================================================
LGPL-2.1-or-later
========================================================================
The following software have components provided under the terms of this license:
- chardet (from https://github.com/chardet/chardet)
========================================================================
LGPL-3.0-only
========================================================================
The following software have components provided under the terms of this license:
- chardet (from https://github.com/chardet/chardet)
- pycparser (from https://github.com/eliben/pycparser)
========================================================================
MIT
========================================================================
@@ -237,6 +218,7 @@ The following software have components provided under the terms of this license:
- cachetools (from https://github.com/tkem/cachetools)
- cffi (from http://cffi.readthedocs.org)
- coverage (from https://coverage.readthedocs.io)
- distributed (from https://distributed.readthedocs.io/en/latest/)
- fastapi (from https://github.com/tiangolo/fastapi)
- grpcio (from http://www.grpc.io)
- h11 (from https://github.com/python-hyper/h11)
@@ -244,7 +226,6 @@ The following software have components provided under the terms of this license:
- jmespath (from https://github.com/jmespath/jmespath.py)
- jsonschema (from http://github.com/Julian/jsonschema)
- msal (from https://github.com/AzureAD/microsoft-authentication-library-for-python)
- msal-extensions (from https://pypi.org/project/msal-extensions/0.1.3/)
- msrest (from https://github.com/Azure/msrest-for-python)
- munch (from http://github.com/Infinidat/munch)
- numpy (from http://www.numpy.org)
@@ -262,7 +243,6 @@ The following software have components provided under the terms of this license:
- python-rapidjson (from https://github.com/python-rapidjson/python-rapidjson)
- python-ulid (from https://github.com/mdomke/python-ulid)
- pytz (from http://pythonhosted.org/pytz)
- requests-oauthlib (from https://github.com/requests/requests-oauthlib)
- six (from http://pypi.python.org/pypi/six/)
- sniffio (from https://github.com/python-trio/sniffio)
- structlog (from http://www.structlog.org/)
@@ -272,21 +252,21 @@ The following software have components provided under the terms of this license:
- zipp (from https://github.com/jaraco/zipp)
========================================================================
MPL-2.0
MIT-CMU
========================================================================
The following software have components provided under the terms of this license:
- certifi (from http://certifi.io/)
- pyparsing (from http://pyparsing.wikispaces.com/)
========================================================================
NCSA
MPL-2.0
========================================================================
The following software have components provided under the terms of this license:
- numpy (from http://www.numpy.org)
- certifi (from http://certifi.io/)
========================================================================
OPL-1.0
NCSA
========================================================================
The following software have components provided under the terms of this license:
@@ -305,16 +285,9 @@ Python-2.0
The following software have components provided under the terms of this license:
- async-timeout (from https://github.com/aio-libs/async_timeout/)
- distributed (from https://distributed.readthedocs.io/en/latest/)
- google-auth (from https://github.com/GoogleCloudPlatform/google-auth-library-python)
- google-auth-oauthlib (from https://github.com/GoogleCloudPlatform/google-auth-library-python-oauthlib)
- numpy (from http://www.numpy.org)
- pandas (from http://pandas.pydata.org)
- portalocker (from https://github.com/WoLpH/portalocker)
- python-dateutil (from https://dateutil.readthedocs.org)
- pytz (from http://pythonhosted.org/pytz)
- rsa (from https://stuvel.eu/rsa)
- sniffio (from https://github.com/python-trio/sniffio)
- typing-extensions (from https://github.com/python/typing)
- urllib3 (from https://urllib3.readthedocs.io/)
@@ -339,20 +312,12 @@ The following software have components provided under the terms of this license:
- jsonpath-ng (from https://github.com/h2non/jsonpath-ng)
========================================================================
ZPL-2.1
========================================================================
The following software have components provided under the terms of this license:
- pytz (from http://pythonhosted.org/pytz)
========================================================================
Zlib
========================================================================
The following software have components provided under the terms of this license:
- grpcio (from http://www.grpc.io)
- numpy (from http://www.numpy.org)
========================================================================
public-domain
@@ -361,9 +326,7 @@ The following software have components provided under the terms of this license:
- botocore (from https://github.com/boto/botocore)
- grpcio (from http://www.grpc.io)
- numpy (from http://www.numpy.org)
- pandas (from http://pandas.pydata.org)
- py (from http://pylib.readthedocs.org/)
- pytz (from http://pythonhosted.org/pytz)
@@ -13,7 +13,6 @@
# limitations under the License.
import asyncio
import base64
import hashlib
import json
import time
@@ -73,6 +72,7 @@ class DefaultWorkerPlugin(WorkerPlugin):
exc = self.worker.exceptions[key]
getLogger().exception("Task '%s' has failed with exception: %s" % (key, str(exc)))
class DaskBulkStorage:
client = None
""" Dask client """
@@ -124,23 +124,21 @@ class DaskBulkStorage:
await DaskBulkStorage.client.close() # or shutdown
DaskBulkStorage.client = None
@staticmethod
def _encode_record_id(record_id: str) -> str:
record_id_b64 = base64.b64encode(record_id.encode()).decode()
return record_id_b64.rstrip('=') # remove padding chars ('=')
def _encode_record_id(self, record_id: str) -> str:
return hashlib.sha1(record_id.encode()).hexdigest()
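This hunk replaces base64 encoding of record ids with a SHA-1 digest. A minimal sketch contrasting the two schemes (function names are illustrative, not the service's API):

```python
import base64
import hashlib

def encode_record_id_b64(record_id: str) -> str:
    # Old scheme: standard base64 with the '=' padding stripped.
    # Output length varies and may contain '/' and '+' characters.
    return base64.b64encode(record_id.encode()).decode().rstrip('=')

def encode_record_id_sha1(record_id: str) -> str:
    # New scheme: always 40 hex characters, safe for blob/object paths.
    return hashlib.sha1(record_id.encode()).hexdigest()
```

The SHA-1 form gives a fixed-length, path-safe directory name at the cost of not being reversible, which the old base64 form was.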
def _get_base_directory(self, protocol=True):
return f'{self.protocol}://{self.base_directory}' if protocol else self.base_directory
def _get_blob_path(self, record_id: str, bulk_id: str, with_protocol=True) -> str:
"""Return the bulk path from the bulk_id."""
record_id_b64 = self._encode_record_id(record_id)
return f'{self._get_base_directory(with_protocol)}/{record_id_b64}/bulk/{bulk_id}/data'
encoded_id = self._encode_record_id(record_id)
return f'{self._get_base_directory(with_protocol)}/{encoded_id}/bulk/{bulk_id}/data'
def _build_path_from_session(self, session: Session, with_protocol=True) -> str:
"""Return the session path."""
record_id_b64 = self._encode_record_id(session.recordId)
return f'{self._get_base_directory(with_protocol)}/{record_id_b64}/session/{session.id}/data'
encoded_id = self._encode_record_id(session.recordId)
return f'{self._get_base_directory(with_protocol)}/{encoded_id}/session/{session.id}/data'
def _load(self, path, **kwargs) -> dd.DataFrame:
"""Read a Parquet file into a Dask DataFrame
@@ -172,9 +170,9 @@ class DaskBulkStorage:
returns a Future<None>
Note:
we should be able to change to or support other formats easily
schema={} instead of 'infer' fixes wrong type inference for string columns starting with NaN values
"""
return self.client.submit(dd.to_parquet, ddf, path, schema="infer",
engine='pyarrow',
return self.client.submit(dd.to_parquet, ddf, path, schema={}, engine='pyarrow',
storage_options=self._parameters.storage_options)
def _save_with_pandas(self, path, pdf: dd.DataFrame):
@@ -230,7 +228,7 @@ class DaskBulkStorage:
if isinstance(pdf.index, pd.DatetimeIndex):
first_idx, last_idx = pdf.index[0].value, pdf.index[-1].value
idx_range = f'{first_idx}_{last_idx}'
shape = hashlib.sha256('_'.join(map(str, pdf)).encode()).hexdigest()
shape = hashlib.sha1('_'.join(map(str, pdf)).encode()).hexdigest()
t = round(time.time() * 1000)
filename = f'{idx_range}_{t}.{shape}'
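The hunk above switches the column fingerprint from SHA-256 to SHA-1 when building chunk filenames. A self-contained sketch of the naming scheme (iterating a DataFrame yields its column names, so the hash here is over the columns, mirroring `'_'.join(map(str, pdf))`):

```python
import hashlib
import time

def chunk_filename(first_idx: int, last_idx: int, columns) -> str:
    # '<first>_<last>_<epoch-ms>.<sha1-of-columns>' — the digest acts as a
    # fingerprint of the frame's "shape" (its column set).
    idx_range = f'{first_idx}_{last_idx}'
    shape = hashlib.sha1('_'.join(map(str, columns)).encode()).hexdigest()
    t = round(time.time() * 1000)
    return f'{idx_range}_{t}.{shape}'
```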
......
@@ -21,7 +21,7 @@ from fastapi import APIRouter, Depends, HTTPException, Request, status
from fastapi.responses import Response
import pandas as pd
from app.bulk_persistence import DataframeSerializerAsync, JSONOrient
from app.bulk_persistence import DataframeSerializerAsync, JSONOrient, get_dataframe
from app.bulk_persistence.bulk_id import BulkId
from app.bulk_persistence.dask.dask_bulk_storage import DaskBulkStorage
from app.bulk_persistence.dask.errors import BulkError, BulkNotFound
@@ -29,6 +29,7 @@ from app.clients.storage_service_client import get_storage_record_service
from app.record_utils import fetch_record
from app.bulk_persistence.mime_types import MimeTypes
from app.model.model_chunking import GetDataParams
from app.model.log_bulk import LogBulkHelper
from app.utils import Context, OpenApiHandler, get_ctx
from app.persistence.sessions_storage import (Session, SessionException, SessionState, SessionUpdateMode)
from app.routers.common_parameters import (
@@ -169,7 +170,7 @@ class DataFrameRender:
return Response(pdf.to_parquet(engine="pyarrow"), media_type=MimeTypes.PARQUET.type)
def get_bulk_uri(record):
def get_bulk_uri_osdu(record):
return record.data.get('ExtensionProperties', {}).get('wdms', {}).get(BULK_URI_FIELD, None)
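The renamed helper reads a deeply nested optional field; chaining `.get` with `{}` defaults means a missing level yields `None` instead of raising `KeyError`. A standalone sketch (the field name is an assumption, the real constant `BULK_URI_FIELD` is defined elsewhere):

```python
BULK_URI_FIELD = 'bulkURI'  # assumption: illustrative stand-in for the real constant

def get_bulk_uri_osdu(data: dict):
    # Each .get falls back to an empty dict, so absence at any
    # level short-circuits safely to the final default of None.
    return data.get('ExtensionProperties', {}).get('wdms', {}).get(BULK_URI_FIELD, None)
```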
@@ -278,16 +279,21 @@ async def get_data_version(
dask_blob_storage: DaskBulkStorage = Depends(with_dask_blob_storage),
):
record = await fetch_record(ctx, record_id, version)
bulk_uri = get_bulk_uri(record)
bulk_urn = get_bulk_uri_osdu(record)
if bulk_urn is not None:
bulk_id, prefix = BulkId.bulk_urn_decode(bulk_urn)
else: # fallback on ddms_v2 Persistence for wks:log schema
bulk_id, prefix = LogBulkHelper.get_bulk_id(record, None)
try:
if bulk_uri is None:
if bulk_id is None:
raise BulkNotFound(record_id=record_id, bulk_id=None)
bulk_id, prefix = BulkId.bulk_urn_decode(bulk_uri)
if prefix != BULK_URN_PREFIX_VERSION:
if prefix == BULK_URN_PREFIX_VERSION:
df = await dask_blob_storage.load_bulk(record_id, bulk_id)
elif prefix is None:
df = await get_dataframe(ctx, bulk_id)
else:
raise BulkNotFound(record_id=record_id, bulk_id=bulk_id)
df = await dask_blob_storage.load_bulk(record_id, bulk_id)
df = await DataFrameRender.process_params(df, ctrl_p)
return await DataFrameRender.df_render(df, ctrl_p, request.headers.get('Accept'))
except BulkError as ex:
ex.raise_as_http()
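The rewritten `get_data_version` dispatches on the URN prefix: new-style bulks go to Dask storage, prefix-less ids fall back to the legacy ddms_v2 persistence. A sketch of that dispatch, assuming a `urn:<prefix>:<bulk_id>` shape (the real `BulkId.bulk_urn_decode` and `BULK_URN_PREFIX_VERSION` are project internals not shown here):

```python
BULK_URN_PREFIX_VERSION = "wdms-1"  # assumption: illustrative value only

def bulk_urn_decode(urn: str):
    # Hypothetical decoder: 'urn:<prefix>:<bulk_id>' -> (bulk_id, prefix);
    # anything else is treated as a bare bulk id with no prefix.
    parts = urn.split(':', 2)
    if len(parts) == 3 and parts[0] == 'urn':
        return parts[2], parts[1]
    return urn, None

def pick_loader(urn: str) -> str:
    # Mirrors the routing above: known prefix -> Dask bulk storage,
    # no prefix -> legacy get_dataframe, anything else -> not found.
    bulk_id, prefix = bulk_urn_decode(urn)
    if prefix == BULK_URN_PREFIX_VERSION:
        return 'dask'
    if prefix is None:
        return 'legacy'
    raise LookupError(f'unknown bulk prefix: {prefix}')
```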
@@ -347,7 +353,7 @@ async def complete_session(
record = await fetch_record(ctx, record_id, i_session.session.fromVersion)
previous_bulk_uri = None
bulk_urn = get_bulk_uri(record)
bulk_urn = get_bulk_uri_osdu(record)
if i_session.session.mode == SessionUpdateMode.Update and bulk_urn is not None:
previous_bulk_uri, _prefix = BulkId.bulk_urn_decode(bulk_urn)
......
@@ -12,16 +12,16 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from asyncio import iscoroutinefunction, gather
import uuid
from fastapi import FastAPI, HTTPException, status
from osdu.core.api.storage.tenant import Tenant
from asyncio import gather, iscoroutinefunction
from app.model import model_utils
from fastapi import FastAPI, HTTPException, status
from odes_storage.models import *
from osdu.core.api.storage.blob_storage_base import BlobStorageBase
from osdu.core.api.storage.exceptions import ResourceNotFoundException
from app.model import model_utils
from osdu.core.api.storage.tenant import Tenant
from ulid import ULID
async def no_check_appkey_token(appkey, token):
@@ -60,12 +60,33 @@ class StorageRecordServiceBlobStorage:
self._container: str = container
self._auth_check = auth_check_coro
def _build_record_path(self, id: str, data_partition: str):
return f'{data_partition or "global"}_r_{id.replace(":", "_")}'
@staticmethod
def _get_record_folder(id: str, data_partition: str):
encoded_id = hash(id)
folder = f'{data_partition or "global"}_r_{encoded_id}'
return folder
async def _get_all_version_object(self, id: str, data_partition: str):
folder = self._get_record_folder(id, data_partition)
tenant = Tenant(project_id=self._project, bucket_name=self._container, data_partition_id=data_partition)
return sorted(await self._storage.list_objects(tenant=tenant, prefix=folder))
async def _build_record_path(self, id: str, data_partition: str, version=None):
folder = self._get_record_folder(id, data_partition)
if version:
return f'{folder}/{version}'
objects = await self._get_all_version_object(id, data_partition)
return objects[-1] if objects else None
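The new storage layout keeps one object per version under a per-record folder, so "latest" is simply the last entry of the sorted listing (versions are ULID-derived integers, which sort by creation time). A simplified sketch — the folder naming here is hypothetical, since the real code hashes the record id first:

```python
def get_record_folder(record_id: str, data_partition: str) -> str:
    # Hypothetical stand-in for _get_record_folder; the real helper
    # encodes the id rather than substituting characters.
    return f'{data_partition or "global"}_r_{record_id.replace(":", "_")}'

def latest_object(objects):
    # Versions sort ascending, so the newest object is last.
    return sorted(objects)[-1] if objects else None

def list_versions(objects):
    # The version is the final path segment: '<folder>/<version>'.
    return [o.split('/')[-1] for o in sorted(objects)]
```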
async def _check_auth(self, appkey=None, token=None):
await self._auth_check(appkey, token)
@staticmethod
def _get_new_id_for_record(record: Record):
kind = record.kind.split(':')
return ':'.join((kind[0], kind[2], uuid.uuid4().hex))
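The new id generator derives record ids from the kind instead of a bare UUID. Assuming the usual OSDU kind shape `authority:source:entityType:version`, the generated id keeps the authority and entity type and appends a random suffix:

```python
import uuid

def new_id_for_kind(kind: str) -> str:
    # 'authority:source:entityType:version' -> 'authority:entityType:<hex>'
    parts = kind.split(':')
    return ':'.join((parts[0], parts[2], uuid.uuid4().hex))
```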
async def create_or_update_records(self,
record: List[Record] = None,
data_partition_id: str = None,
@@ -77,12 +98,12 @@ class StorageRecordServiceBlobStorage:
# insert id if new record
for rec in record_list:
if rec.id is None:
rec.id = str(uuid.uuid4())
rec.id = self._get_new_id_for_record(rec)  # previously str(uuid.uuid4())
rec.version = int(ULID())  # generate a new version; ULIDs sort by creation time, so the largest value is the latest version
await gather(*[
self._storage.upload(
Tenant(project_id=self._project, bucket_name=self._container, data_partition_id=data_partition_id),
self._build_record_path(record.id, data_partition_id),
await self._build_record_path(record.id, data_partition_id, version=rec.version),
model_utils.record_to_json(record),
content_type='application/json')
for record in record_list
@@ -91,16 +112,20 @@ class StorageRecordServiceBlobStorage:
# manual for now
return CreateUpdateRecordsResponse(recordCount=len(record_list),
recordIds=[record.id for record in record_list],
recordIdVersions=[record.version for record in record_list],
skipped_record_ids=[])
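Versions are now `int(ULID())` values (via the python-ulid package in the real code). The reason this works as a version number: a ULID packs a 48-bit millisecond timestamp above 80 random bits, so plain integer comparison orders values by creation time. A stdlib-only sketch of that layout:

```python
import random
import time

def ulid_like_int() -> int:
    # ULID-like integer: millisecond timestamp in the high bits,
    # randomness in the low 80 bits; later calls compare greater.
    return (int(time.time() * 1000) << 80) | random.getrandbits(80)
```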
async def get_record(self,
async def get_record_version(self,
id: str,
version: int,
data_partition_id: str = None,
appkey: str = None,
token: str = None) -> Record:
await self._check_auth(appkey, token)
object_name = self._build_record_path(id, data_partition_id)
try:
object_name = await self._build_record_path(id, data_partition_id, version=version)
if object_name is None:
raise ResourceNotFoundException("Item not found")
bin_data = await self._storage.download(
Tenant(project_id=self._project, bucket_name=self._container, data_partition_id=data_partition_id),
object_name)
@@ -114,17 +139,18 @@ class StorageRecordServiceBlobStorage:
appkey: str = None,
token: str = None) -> RecordVersions:
# only one version /latest is supported
return RecordVersions(recordId=id, versions=[0])
objects = await self._get_all_version_object(id, data_partition_id)
versions = [o.split('/')[-1] for o in objects]
return RecordVersions(recordId=id, versions=versions)
async def get_record_version(self,
id: str,
version: int,
data_partition_id: str = None,
attribute: List[str] = None,
appkey: str = None,
token: str = None) -> Record:
# always return the latest
return await self.get_record(id, data_partition_id, appkey, token)
async def get_record(self,
id: str,
data_partition_id: str = None,
attribute: List[str] = None,
appkey: str = None,
token: str = None) -> Record:
# return the latest
return await self.get_record_version(id, None, data_partition_id, appkey, token)
async def delete_record(self,
id: str,
@@ -132,13 +158,13 @@ class StorageRecordServiceBlobStorage:
appkey: str = None,
token: str = None) -> None:
await self._check_auth(appkey, token)
object_name = self._build_record_path(id, data_partition_id)
try:
await self._storage.delete(
Tenant(project_id=self._project, bucket_name=self._container, data_partition_id=data_partition_id),
object_name)
except FileNotFoundError:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Item not found")
for object_name in await self._get_all_version_object(id, data_partition_id):
try:
await self._storage.delete(
Tenant(project_id=self._project, bucket_name=self._container, data_partition_id=data_partition_id),
object_name)
except FileNotFoundError:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Item not found")
async def get_schema(self, kind, data_partition_id=None, appkey=None, token=None, *args, **kwargs):
raise NotImplementedError('StorageServiceBlobStorage.get_schema')
@@ -50,6 +50,7 @@ from app.record_utils import fetch_record, update_records
router = APIRouter()
LOGS_API_BASE_PATH = '/logs'
async def get_persistence() -> Persistence:
return Persistence()
......
@@ -20,6 +20,7 @@ from fastapi import FastAPI, Depends
from fastapi.openapi.utils import get_openapi
from app import __version__, __build_number__, __app_name__
from app import bulk_utils
from app.auth.auth import require_opendes_authorized_user
from app.conf import Config, check_environment
from app.errors.exception_handlers import add_exception_handlers
@@ -49,7 +50,6 @@ from app.routers.trajectory import trajectory_ddms_v2
from app.routers.dipset import dipset_ddms_v2, dip_ddms_v2
from app.routers.logrecognition import log_recognition
from app.routers.search import search, fast_search, search_v3, fast_search_v3
from app.routers.ddms_v3 import bulk_v3
from app.clients import StorageRecordServiceClient, SearchServiceClient
from app.utils import (
get_http_client_session,
@@ -222,7 +222,7 @@ wdms_app.include_router(
prefix=ALPHA_APIS_PREFIX + DDMS_V3_PATH + welllog_ddms_v3.WELL_LOGS_API_BASE_PATH,
tags=tags, dependencies=dependencies)
wdms_app.include_router(
bulk_v3.router_bulk,
bulk_utils.router_bulk,
prefix=ALPHA_APIS_PREFIX + DDMS_V3_PATH + welllog_ddms_v3.WELL_LOGS_API_BASE_PATH,
tags=tags, dependencies=dependencies)
@@ -232,10 +232,20 @@ wdms_app.include_router(
prefix=ALPHA_APIS_PREFIX + DDMS_V3_PATH + wellbore_trajectory_ddms_v3.WELLBORE_TRAJECTORIES_API_BASE_PATH,
tags=tags, dependencies=dependencies)
wdms_app.include_router(
bulk_v3.router_bulk,
bulk_utils.router_bulk,
prefix=ALPHA_APIS_PREFIX + DDMS_V3_PATH + wellbore_trajectory_ddms_v3.WELLBORE_TRAJECTORIES_API_BASE_PATH,
tags=tags, dependencies=dependencies)
# log bulk v2 APIs
wdms_app.include_router(
sessions.router,
prefix=ALPHA_APIS_PREFIX + DDMS_V2_PATH + log_ddms_v2.LOGS_API_BASE_PATH,
tags=tags, dependencies=dependencies)
wdms_app.include_router(
bulk_utils.router_bulk,
prefix=ALPHA_APIS_PREFIX + DDMS_V2_PATH + log_ddms_v2.LOGS_API_BASE_PATH,
tags=tags, dependencies=dependencies)
# ------------- add alpha feature: ONLY MOUNTED IN DEV AND DA ENVs
def enable_alpha_feature():
""" must be called to enable and activate alpha feature"""
......
@@ -27,9 +27,8 @@ cloudpickle==1.6.0
colorama==0.4.4
coverage==5.5
cryptography==3.4.7
dask==2021.4.1
dask[distributed]==2021.6.2
decorator==5.0.9
distributed==2021.4.1
fastapi==0.65.1
fsspec==2021.6.0
gcsfs==2021.6.0
......
@@ -19,7 +19,7 @@ opencensus-ext-ocagent
opencensus-ext-logging
# for chunking feature
dask[distributed]==2021.4.1
dask[distributed]==2021.6.2
fsspec
--extra-index-url \
......
@@ -7,6 +7,7 @@ httpx
numpy
pandas
pyarrow
python-ulid
# Note: Python 3.8's unittest.mock already includes the Mock 4.0+ feature set.
mock>=4.0
......