Metadata-Version: 2.1
Name: delpha-db-manager
Version: 1.2.1
Summary: Delpha Database Management System
Home-page: https://github.com/Delpha-Assistant/DelphaDBManagement
Author: Hugo Paigneau
Author-email: hugo.paigneau@delpha.io
License: MIT
Description: # DelphaDBManagement
        
        Management API for Delpha to work with data inside the Cassandra cluster.
        
        This repository is Python-based for now, but is intended to be open to other languages (Java, JavaScript, ...).
        It provides a module to handle database management on our Cassandra cluster.
        
        
        ![title](docs/images/db1.png)
        
        # Security Objectives
        
        We need a secure way to handle data:
        
        **Private keys** are generated by Delpha only and stored in an authentication table inside Cassandra. Users must present a matching private key to access the DBMS. Each key is bound to a dedicated organisation, and only one private key is tagged as Admin, unlocking full DBMS access.
        
        ![title](docs/images/db3.png)
        
        ![title](docs/images/db2.png)
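        As a sketch of the key-matching idea above (the function name, key values, and key-to-role mapping are hypothetical, not the actual implementation), a constant-time check could look like this:
        
        ```python
        import hmac
        
        def check_private_key(provided_key, stored_keys):
            """Compare a user-supplied private key against keys loaded from the
            authentication table (hypothetical dict of key -> role).
            Returns (authorized, is_admin); comparison is constant-time."""
            for key, role in stored_keys.items():
                if hmac.compare_digest(provided_key, key):
                    return True, role == "admin"
            return False, False
        
        # Example with made-up keys:
        keys = {"org-key-123": "org", "master-key-999": "admin"}
        print(check_private_key("master-key-999", keys))  # (True, True)
        print(check_private_key("unknown-key", keys))     # (False, False)
        ```
        
        `hmac.compare_digest` avoids leaking key contents through timing differences, which matters when the key is the only credential.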
        
        
        # Installation
        ### Requirements
        - **Python** >= 3.8
        - **cassandra-driver**
        - **simple-salesforce**
        - **requests**
        - **boto3**
        
        ### Install Dependencies
        
        Common
        
        ```bash
        pip install -r requirements.txt
        ```
        
        ### Salesforce Manager
        
        Salesforce data manager using the Salesforce API: **SalesforceManager**.
        
        - Parameter : instance_name: Salesforce instance name
        - Parameter : client_id: Consumer Key of the authorized app
        - Parameter : client_secret: Consumer Secret of the authorized app
        - Parameter : username: username to log in to the Salesforce instance
        - Parameter : password: password to log in to the Salesforce instance
        - Parameter : security_token: user's security token
        
        
        ```python
        sf_manager = SalesforceManager(instance_name, client_id, client_secret, username, password, security_token)
        sf_manager.help()
        res, size = sf_manager.query("SELECT Name from Contact")
        ```
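        The exact shape of the query result depends on the manager; assuming `res` is a list of records keyed by field name (an assumption, not the documented return type), the names could be extracted like this:
        
        ```python
        # Hypothetical result shape: a list of dicts keyed by Salesforce field name.
        res = [{"Name": "Ada Lovelace"}, {"Name": "Grace Hopper"}]
        size = len(res)
        
        # Pull the Name field out of every returned record.
        names = [record["Name"] for record in res]
        print(size, names)  # 2 ['Ada Lovelace', 'Grace Hopper']
        ```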
        
        
        ### Cassandra Manager
        
        Cassandra Database manager : **CassandraManager**.
        
        - Parameter : pem_file_path: String path of the .pem file to use for 2FA connection
        - Parameter : config_file_path: String path of the .yaml file to use to get credentials
        - Parameter : key_space: String session keyspace to set for the Cluster
        
        ```python
        cass_cluster = CassandraManager(pem_file_path, config_file_path, "delpha_actions")
        cass_cluster.execute("SELECT * FROM actions_conv_by_user").all()
        ```
        
        
        Dictionary component of the Cassandra database manager : **DictionaryManager**.
        
        ```python
        manager = DictionaryManager(cass_cluster)
        manager.list_tables
        ```
        ```
        ['keyspace1', 'keyspace2', ...]
        ```
        
        
        
        ### AppFlow Manager
        
        Delpha AWS data Handler (AppFlows) : **AppFlowManager**.
        
        - Parameter : key: String AWS key
        - Parameter : secret: String AWS secret
        - Parameter : flow_type: String Flow type to handle ['contact', 'account']
        - Parameter : bucket_name: String to use specific bucket
        - Parameter : region: String AWS region 
        
        ```python
        handler = AppFlowManager(aws_key_id, aws_secret, 'contact', bucket_name) # Set up the contact flow (contact dataset)
        df = handler.get_flow_parquet_data()
        ```
        
        
        ```python
        flow_status, flow_last_execution_record = handler.start_flow(flow_name) #Start flow on AWS side
        
        handler.ensure_spark_format(flow_name) #Check if data is in right format (.parquet)
        
        handler.get_last_flow_id(flow_name) #Get last folder Name in bucket for specific flow.
        ```
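        To illustrate the idea behind `get_last_flow_id`, here is a standalone sketch (the key layout and function name are hypothetical; the real method's logic may differ) that picks the most recent run folder out of S3-style object keys:
        
        ```python
        def last_flow_folder(object_keys, flow_name):
            """From keys shaped like '<flow_name>/<run_id>/part.parquet'
            (hypothetical layout), return the lexicographically last run_id."""
            runs = {
                key.split("/")[1]
                for key in object_keys
                if key.startswith(flow_name + "/") and key.count("/") >= 2
            }
            return max(runs) if runs else None
        
        keys = [
            "contact/2023-01-01T00-00/part-0.parquet",
            "contact/2023-02-01T00-00/part-0.parquet",
            "account/2023-03-01T00-00/part-0.parquet",
        ]
        print(last_flow_folder(keys, "contact"))  # 2023-02-01T00-00
        ```
        
        With timestamp-named run folders, lexicographic order matches chronological order, so `max` returns the latest run.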
        
        ### S3 Manager
        
        Delpha AWS data handler (S3) : **S3Manager**.
        
        - Parameter : key: String AWS key
        - Parameter : secret: String AWS secret
        - Parameter : region: String AWS region 
        
        ```python
        S3 = S3Manager(aws_key_id, aws_secret, region) # Set up the S3 handler
        buckets = S3.buckets # List available buckets
        ```
        
        
        
        
        ### How to publish the package to PyPI
        
        ```shell
        python setup.py sdist bdist_wheel
        twine upload dist/*
        ```
        
        ## License
        
        This project is licensed under the MIT License.
        
Keywords: database,salesforce api,cassandra api,aws api
Platform: UNKNOWN
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Description-Content-Type: text/markdown
