AET Modern Data Platform

Speed, scalability, and security are the core needs of data-driven organizations today. AET's reference architecture for the Modern Data Platform is built on open standards and open-source innovations, and addresses the constraints of traditional data lakes and data warehouses faced by enterprises and government organizations. AET leverages container technologies to flexibly serve today's resource-intensive, short-lived use cases: businesses frequently face urgent data needs that require resources to be scaled up and down on demand. The AET Container Data Platform architecture is designed for scalability, reliability, security, performance, and cost savings.

[Diagram: AET Modern Data Platform reference architecture]

AE Data Visualization

DATA EXPLORATION & VISUALIZATION - Many businesses fall short because they lack a clear understanding of their data and its business value. Without the right tools to visualize that data, operational and business leaders may fail to correct and optimize their strategies on the go.

With AE Data Visualization, easily transform massive data sets into actionable insights. The powerful, fully customisable chart and dashboard development interface can be adapted to convert raw data into decision-ready information. AE Data Visualization is a cloud-ready, web-based application with extensive data source support: PostgreSQL, MySQL, Oracle, MSSQL, Hive, Elasticsearch, Druid, and cloud databases. It offers an extensive list of supported chart types that give decision-makers meaningful ways to discover relationships and patterns between business and operational activities. Underpinned by an in-memory OLAP engine powered by Apache Druid, it delivers fast exploratory analytics, so users can discover insights and business intelligence quicker than ever before.

Combining advanced technical capabilities with easy-to-use features, AE Data Visualization’s flexibility and robustness will help you connect the dots and uncover the true value of your datasets.



Explore & Visualize

Make it easy to slice, dice and visualize data

View

View your data through interactive dashboards

Investigate

Use SQL Lab to write queries to explore your data




Architecture




Web-based

User-friendly web-based application

Smart Dashboards

Multiple views of data to get richer insight

Modern Architecture

Highly scalable leveraging the power of your existing data infrastructure

Ease of Use

Easy-to-use interface for exploring and visualizing data

Seamless Integration

Connect to any SQL-based data source through SQLAlchemy (see the sketch after this list)

Powerful Tool

Powerful business intelligence tool with flexible data visualization options
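As a concrete illustration of the SQLAlchemy connectivity named in the feature list above, the following sketch registers two of the supported sources and runs an exploratory query. The host names, credentials, and the orders table are illustrative placeholders rather than values from this document, and the druid:// URI assumes the open-source pydruid dialect; consult the product documentation for exact connection strings.

    from sqlalchemy import create_engine, text

    # Hypothetical connection URIs; hosts, credentials, and the "orders"
    # table are placeholders for whatever sources you actually register.
    pg_engine = create_engine("postgresql+psycopg2://user:secret@db-host:5432/sales")
    # Druid is reachable through SQLAlchemy via the pydruid dialect.
    druid_engine = create_engine("druid://druid-broker:8082/druid/v2/sql/")

    # A typical exploratory aggregate against the PostgreSQL source.
    with pg_engine.connect() as conn:
        result = conn.execute(
            text("SELECT region, SUM(amount) AS total FROM orders GROUP BY region")
        )
        for region, total in result:
            print(region, total)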




Minimum Requirements

  • Node Specification
    • 4 cores
    • 16GB RAM (32GB recommended)
    • Minimum 120GB SSD for OS (RAID 1 recommended)
    • Minimum 200GB SSD for data (RAID 1 or RAID 4 recommended)
    • 1 GbE Network (10GbE Network recommended in cluster setup)
  • Non-HA setup - 1 node minimum
  • HA setup - 3 nodes minimum + 1 load balancer node
Pricing / Packages

  • Installation service (core-based pricing model, unlimited users*) + 1-day user training (excludes data engineering / report development services)
  • On-premises installation priced per block of 4 cores
  • Unlimited users, but up to 10 concurrent users, or up to 2GB of data, recommended per 4-core package
  • 3 named support users
  • Single pricing bundled with:
    • AE Linux Server
    • AE In-Memory OLAP (Druid)
    • AE PostgreSQL Database
    • AE Task Queue (RabbitMQ)

AE Data Workflow

WORKFLOW MANAGEMENT PLATFORM - With huge amounts of data being generated every second from business transactions, sales figures, customer logs, and stakeholders, data is the fuel that drives companies. The exponential growth of data has made it a challenge for businesses to effectively handle increased users, data volume, and data complexity.

AE Data Workflow is an excellent tool to seamlessly organize, execute, and monitor data workflows. It provides a platform for scheduling and monitoring workflows that is useful for architecting and orchestrating complex data pipelines.

AE Data Workflow is cloud-ready and highly flexible, designed to work within an architecture that is standard for nearly every software development environment. This flexibility and robustness make AE Data Workflow a powerful workflow management platform.
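This document does not show AE Data Workflow's authoring API, so as an illustration of what programmatic pipeline definition typically looks like, the sketch below builds a small daily ETL workflow with the open-source Apache Airflow API, a common foundation for schedulers of this kind. The DAG id, task names, and task bodies are assumptions for the example.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder task bodies; a real pipeline would extract from and load
    # into the data sources configured on the platform.
    def extract():
        print("pulling raw records")

    def transform():
        print("cleaning and aggregating records")

    # A daily ETL pipeline: extract runs first, then transform.
    with DAG(
        dag_id="example_daily_etl",       # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        extract_task >> transform_task    # set execution order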



Open-source Task Scheduler

Allows users to programmatically author, build, monitor, and share workflows in the cloud

Create & Track Data Flows

Allows users to create workflows with high granularity and track their progress as they execute

Manage Data Flows

Provides tools that make it easier to manage data flows and data processing steps in an integrated manner




Architecture




Useful User Interface

Monitor, schedule, and manage your workflows via a robust & modern web application

Automation

Automate processes: fetching data, machine learning pipelines, data transfer, and monitoring

Modern Architecture

Highly scalable, leveraging the power of your existing data infrastructure

Easy-to-use

With Python knowledge, easily build machine learning models, transfer data, & manage your infrastructure

Robust Integration

Provides many plug-and-play operators, easy to apply to current infrastructure and extend to next-gen tech

Powerful Tool

Powerful workflow management platform with flexible scheduling and orchestration options




Minimum Requirements

  • Node Specification
    • 4 cores
    • 16GB RAM (32GB recommended)
    • Minimum 120GB SSD for OS (RAID 1 recommended)
    • Minimum 200GB SSD for data (RAID 1 or RAID 4 recommended)
    • 1 GbE Network (10GbE Network recommended in cluster setup)
  • Non-distributed setup
    • 1 compute node
  • Distributed compute setup:
    • 2 compute nodes minimum
    • 1 master node minimum
    • 1 edge node minimum
Pricing / Packages

  • Installation service + training: Data Preparation with Python
  • Training: Distributed Data Preparation with Python & Apache Spark (see the Spark sketch after the bundle list below)
  • Single pricing bundled with:
    • AE Linux Server
    • AE Compute Cluster (Apache Spark)
    • AE PostgreSQL Database (for metadata)
    • AE Task Queue (RabbitMQ)
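To give a flavour of the distributed data preparation taught in the Apache Spark training above, here is a minimal PySpark sketch; the input file, column names, and output path are illustrative assumptions rather than course material.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("data-prep").getOrCreate()

    # Hypothetical input: a CSV of sales records with "region" and "amount".
    df = spark.read.csv("sales.csv", header=True, inferSchema=True)

    # Drop incomplete rows, enforce a numeric type, and aggregate by region.
    summary = (
        df.dropna(subset=["amount"])
          .withColumn("amount", F.col("amount").cast("double"))
          .groupBy("region")
          .agg(F.sum("amount").alias("total_amount"))
    )

    # Persist the prepared data in a columnar format for downstream use.
    summary.write.mode("overwrite").parquet("sales_by_region")

    spark.stop()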

Contact Us

[email protected]