
BI Data Engineer / Architect Resume

Dallas, TX / Atlanta, GA

SUMMARY:

  • More than fifteen years of IT experience, including over twelve years of expertise in administering and designing complex applications, with emphasis on the design and development of data solutions, business intelligence reporting, ETL development, testing, and documentation.
  • Hands-on experience with big data technologies including Apache Hadoop, data lakes, Hive, HBase, MongoDB, Cassandra, Pig, Spark 2, Sqoop, Oozie, Kafka, Flume, NiFi, Airflow, Google BigQuery, Redshift, Azure SQL Data Warehouse, Python and R scripting, Jupyter Notebook, and Zeppelin.
  • Experience in the design, development, and management of complex data applications with Informatica, SSIS, ODI 11g/12c, DataStage, Talend, Alteryx, Pentaho, and Python.
  • Attend user meetings to gather requirements for analytics / web applications; propose new solutions and innovative ways to achieve operational excellence.
  • Monitor and create alerts for critical KPIs, metrics, and data visualizations for business processes.
  • Experience with IAM, VPC, compute, storage, users, DNS, firewall, and SSO setup.
  • Experience working with AWS S3, S3 Glacier, AWS Redshift, Redshift Spectrum, AWS Glue, AWS Data Pipeline, EMR, AWS Lambda, AWS QuickSight, and AWS Athena.
  • Experience with Tableau Desktop, Tableau Server, and QlikView data extracts; build complex dashboards making extensive use of blending, actions, calculated fields, parameters, and set analysis.
  • Experience creating and managing dashboards and reports with Tableau, Power BI, QlikView, Business Objects, MicroStrategy, QuickSight, Data Studio, SSRS, SSAS, Cognos, and OBIEE 11g and 12c.
  • Experience with SSIS, SSRS, SSAS, Power BI Desktop, Power BI Service, and M language interactions.
  • Experience in data analysis and visualization using Python and R to clean up, explore, and visualize data.
  • Experience working with Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL, Azure SQL Data Warehouse, Azure Analytics, PolyBase, Azure HDInsight, and Azure Databricks.
  • Experience working with Google Cloud Storage, BigQuery, Dataflow, App Engine, Google Compute Engine, Kubernetes, and Python & SQL scripts.
  • Experience includes Jenkins, Docker, Kubernetes, Chef, Terraform, and CloudFormation.
  • Experience setting up portal applications using AngularJS, Python Django, Node.js, TypeScript, HTML, and CSS, integrating BI applications, and deploying applications to serverless App Engine.
  • Develop dimensions, hierarchies, measures, and aggregations, and add multiple data sources.
  • Experience includes performance tuning of SQL by implementing materialized views, bitmap indexes, partitions, and aggregate tables. Develop dimensional models using the Kimball methodology and work on source-to-target data mappings.
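The aggregate-table tuning approach above can be pictured with a small sketch: pre-summarize a fact table at a coarser grain so dashboard queries hit a few summary rows instead of scanning detail. This is only an illustration in plain Python with hypothetical column names; in practice the aggregate lives in the database as a materialized view or summary table.

```python
from collections import defaultdict

def build_aggregate_table(fact_rows, group_keys, measure):
    """Pre-aggregate detail rows so queries can read a small
    summary table instead of scanning the full fact table."""
    totals = defaultdict(float)
    for row in fact_rows:
        key = tuple(row[k] for k in group_keys)
        totals[key] += row[measure]
    return [dict(zip(group_keys, key), **{measure: total})
            for key, total in sorted(totals.items())]

# Hypothetical order-level detail rolled up to region grain.
fact_sales = [
    {"region": "TX", "order_id": 1, "amount": 100.0},
    {"region": "TX", "order_id": 2, "amount": 50.0},
    {"region": "GA", "order_id": 3, "amount": 75.0},
]
agg = build_aggregate_table(fact_sales, ["region"], "amount")
```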

PROFESSIONAL EXPERIENCE:

Confidential, Dallas, TX / Atlanta, GA

BI Data Engineer / Architect

Environment: Informatica, DataStage, Talend, Tableau, Power BI, Hadoop, Spark, Hive, SSIS, SSRS, SSAS, Redshift, Glue, Athena, Data Catalog, EMR, Kinesis, GCP, AngularJS, Node.js, TypeScript, App

Responsibilities:

  • Create Informatica and Talend mappings / jobs to build one-time, full, and incremental loads. Apply data fixes as agreed with the business.
  • Implement a data lake in Google BigQuery and Google Cloud Storage, with SQL scripts to load data to BigQuery and Composer (Confidential) to run the Talend and query scripts.
  • Work on Spark, Spark Streaming, Hive, Pig, Sqoop, Flume, and Kafka scripts / configuration to stream, cleanse, prepare, and load data.
  • Set up metadata management in Postgres to manage the full and incremental loads.
  • Set up a Kubernetes cluster and Docker images for running Talend JAR files that load data to the data lake.
  • Design BI applications in Tableau, Power BI, SSIS, SSRS, SSAS, Informatica, and big data / cloud platforms.
  • Coordinate the execution of UAT and regression testing and the resolution of issues identified.
  • Create executive, analytical, strategic, and operational dashboards with various KPIs and goals for business users. Use drill-down and detailed reports to surface the line-level items behind the trends.
  • Set up a portal application in AngularJS, Python Django, Node.js, TypeScript, HTML, and CSS to integrate various applications in App Engine.
  • Create real-time data streaming solutions using Kafka and Flume. Set up data ingestion using Sqoop.
  • Implement GCS, BigQuery, App Engine, Dataflow, Kubernetes, Google Container Engine, and VPC.
  • Use AWS Glue to crawl the data lake in S3 and populate the Data Catalog. Use Talend jobs to load data from various source systems (Oracle, SQL Server, FTP server) to S3. Use Spark / Python / COPY-command scripts to apply business logic in S3. Use AWS Lambda, AWS QuickSight, and AWS Athena for analytics.
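The metadata-driven full / incremental load pattern mentioned above (a Postgres table tracking what has been loaded) can be sketched in plain Python. This is a stand-in, not the actual implementation: the metadata store is a dict, and table and column names are hypothetical.

```python
from datetime import datetime

def plan_load(metadata, table, source_rows, ts_col="updated_at"):
    """Choose full vs incremental load from the high-water mark
    stored per table, then advance the mark after selecting rows."""
    watermark = metadata.get(table)            # None => never loaded
    if watermark is None:
        rows, mode = list(source_rows), "full"
    else:
        rows = [r for r in source_rows if r[ts_col] > watermark]
        mode = "incremental"
    if rows:                                   # advance the watermark
        metadata[table] = max(r[ts_col] for r in rows)
    return mode, rows

meta = {}
src = [{"id": 1, "updated_at": datetime(2020, 1, 1)},
       {"id": 2, "updated_at": datetime(2020, 1, 5)}]
mode1, first = plan_load(meta, "orders", src)     # first run: full load
src.append({"id": 3, "updated_at": datetime(2020, 1, 9)})
mode2, second = plan_load(meta, "orders", src)    # second run: new rows only
```

The same shape works whether the loader is a Talend job, a Spark script, or a COPY command; only the watermark bookkeeping is shown here.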

Confidential, Dallas, TX

Lead Consultant

Environment: Informatica, Talend, Tableau, Power BI, Python, Sqoop, Spark, SSIS, SSRS, SSAS, Azure.

Responsibilities:

  • Design BI applications in Tableau, Power BI, SSIS, SSRS, SSAS, Informatica, and DataStage.
  • Implement a data lake in Azure Blob Storage, Azure Data Lake, Azure Analytics, and Databricks; load data to Azure SQL Data Warehouse using PolyBase and Azure Data Factory.
  • Work on Python / Bash / SQL scripts to load data from the data lake to the data warehouse.
  • Work on SQL scripts, stored procedures, queries, and packages to load data into SQL Server, Oracle, and SQL Data Warehouse. Work on package configuration to set up automated ETL load processing for one-time and incremental data loads.
  • Set up infrastructure as code / automation using Chef, Jenkins, Kubernetes, and Docker.
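One common shape for the PolyBase path above is an external table over the lake followed by a CTAS into a distributed warehouse table. The Python helper below only builds the T-SQL string so the pattern is visible; the table names and distribution key are illustrative assumptions, not the project's actual objects.

```python
def polybase_ctas(target, external_table, distribution="HASH(order_id)"):
    """Build a CTAS statement that loads a PolyBase external table
    from the data lake into a distributed warehouse table."""
    return (
        f"CREATE TABLE {target}\n"
        f"WITH (DISTRIBUTION = {distribution})\n"
        f"AS SELECT * FROM {external_table};"
    )

stmt = polybase_ctas("dbo.FactOrders", "ext.FactOrders_stage")
```

A Python or Bash orchestration script would submit this statement to the warehouse after the external table's underlying files land in blob storage.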

Confidential, Buffalo, NY

Data Architect

Environment: Informatica, DataStage, Talend, Tableau, Python, Hadoop, Hive, SSIS, SSRS, SSAS

Responsibilities:

  • Work on Spark, Spark Streaming, Hive, Pig, Sqoop, Flume, and Kafka scripts / configuration to stream, cleanse, prepare, and load data for the data lake, data warehouse, advanced analytics, and ML processing.
  • Experience tuning and troubleshooting performance issues with Hive queries. Design and create Hive data models with partitioning, dynamic partitioning, and buckets.
  • Work with the Oozie and Airflow workflow schedulers to manage Hadoop jobs as directed acyclic graphs (DAGs). Experience working with different file formats such as Avro, Parquet, ORC, and Sequence.
  • Implement AWS Kinesis to sink data to Redshift and S3.
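Both Oozie and Airflow run jobs as a DAG: a task starts only after its upstream tasks finish. The dependency-resolution idea can be sketched without either scheduler, using Kahn's algorithm in plain Python; the job names below are hypothetical stand-ins for a typical Sqoop-then-Hive pipeline.

```python
def topo_order(deps):
    """Return a run order where every task follows its upstream
    dependencies (Kahn's algorithm over a DAG)."""
    pending = {task: set(up) for task, up in deps.items()}
    order = []
    while pending:
        ready = sorted(t for t, up in pending.items() if not up)
        if not ready:
            raise ValueError("cycle detected - not a DAG")
        for t in ready:
            order.append(t)
            del pending[t]
        for up in pending.values():    # mark finished tasks as satisfied
            up -= set(ready)
    return order

# sqoop_import -> hive_stage -> {hive_agg, quality_check} -> publish
jobs = {
    "sqoop_import": [],
    "hive_stage": ["sqoop_import"],
    "hive_agg": ["hive_stage"],
    "quality_check": ["hive_stage"],
    "publish": ["hive_agg", "quality_check"],
}
run_order = topo_order(jobs)
```

In Airflow the same graph would be declared with task dependencies inside a DAG definition; the scheduler performs this ordering for you.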

Confidential, Detroit, MI

Data Consultant / Lead

Technologies: Informatica, DataStage, Talend, UNIX, Tableau, AWS Redshift, EMR, Kinesis.

Responsibilities:

  • Create ETL mappings for operational dashboards covering various KPIs and business metrics, allowing powerful drill-down to detail reports to understand the data at a very granular level.
  • Work on a data lake in AWS S3: copy data to Redshift and implement business logic with custom SQL, orchestrated by Unix and Python scripts, for analytics solutions.
  • Partition tables and add parallel configuration at the table level based on long-running query analysis.
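Partitioning driven by long-running query analysis, as above, amounts to choosing a partition key the slow queries filter on, so a scan prunes to one partition instead of the whole table. A toy sketch in plain Python with a hypothetical month key:

```python
from collections import defaultdict

def partition_by(rows, key):
    """Split rows into partitions on the column slow queries
    filter by, so a filtered read touches one partition only."""
    parts = defaultdict(list)
    for row in rows:
        parts[row[key]].append(row)
    return dict(parts)

orders = [{"month": "2020-01", "id": 1},
          {"month": "2020-01", "id": 2},
          {"month": "2020-02", "id": 3}]
by_month = partition_by(orders, "month")
jan = by_month["2020-01"]   # pruned "scan": 2 rows, not all 3
```

In a real database the same decision is expressed as range or list partitioning DDL; the analysis step is finding which predicate dominates the slow queries.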

Confidential, Tampa, FL

BI Lead / Architect

Technologies: Tableau, SSIS, SSRS, SSAS, OBIEE 11g, ODI 11g, Cognos 10.2, QlikView 11, Python, R, Informatica, Alteryx, Oracle 11g, Power BI, AWS, Redshift, Hadoop, Hive, HBase

Responsibilities:

  • Design BI applications in Tableau, QlikView, SSIS, SSRS, SSAS, OBIEE, Cognos, and Informatica.
  • Part of the Agile BI / ETL team; attend regular user meetings to go through the requirements for the data / BI sprints. Highly visible data flows, dashboards, and reports are created based on the user stories.
  • Work on Python / Scala / Bash scripts and ETL jobs to load data into the Hadoop HDFS cluster, data lake, and DW. Work on Hive, Sqoop, Oozie, Flume, and Pig scripts to cleanse, prepare, and load data into the data warehouse.
  • Create and maintain log, metadata, and audit tables to preserve the integrity of the data lake.
  • Maintain the big data cluster from Ambari to monitor services, create alerts, and restart services.
  • Work with system administrators / architects from a CPU, memory, and storage perspective.
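Audit tables like those above typically record, per load, at least a row count and a content checksum so a later reload can be verified against the original. A minimal sketch; the field names are assumptions, not the project's actual schema.

```python
import hashlib
from datetime import datetime, timezone

def audit_record(table, rows):
    """Summarize a load batch for an audit table: row count plus a
    deterministic content checksum for integrity checks."""
    digest = hashlib.sha256()
    for row in rows:
        digest.update(repr(sorted(row.items())).encode())
    return {
        "table_name": table,
        "row_count": len(rows),
        "checksum": digest.hexdigest(),
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    }

batch = [{"id": 1, "val": "a"}, {"id": 2, "val": "b"}]
rec = audit_record("stg_orders", batch)
```

Re-running the checksum over a reloaded batch and comparing it to the stored audit row is what "preserving the integrity of the data lake" reduces to in practice.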

Confidential, Dallas

BI Lead

Technologies: Oracle 11g, BO 4, MicroStrategy 9, SSIS, SSRS, SSAS, QlikView 11

Responsibilities:

  • Create SSIS data-load packages and Business Objects, SSRS, SSAS, and QlikView reports and dashboards.
  • Develop load-balancing and stress-testing strategies, coordinate the execution of testing, and facilitate problem resolution for issues identified.
  • Set up data models per the Kimball methodology and create the Operational Performance, Provider Performance, Loyalty, and Credit Card applications.
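A core piece of a Kimball-style model like the one above is the Type 2 slowly changing dimension: expire the current row and append a new current row when an attribute changes. A simplified sketch in plain Python; the customer/tier attributes are hypothetical.

```python
def scd2_update(dim, key_col, key, changes, as_of):
    """Kimball Type 2 SCD: close out the current row for `key`
    and append a new current row carrying the changed attributes."""
    out, current = [], None
    for row in dim:
        if row[key_col] == key and row["current"]:
            current = row                              # remember old values
            out.append({**row, "current": False, "end": as_of})
        else:
            out.append(row)
    out.append({**(current or {}), key_col: key, **changes,
                "start": as_of, "end": None, "current": True})
    return out

dim = [{"cust_id": 7, "tier": "gold", "start": "2019-01-01",
        "end": None, "current": True}]
dim = scd2_update(dim, "cust_id", 7, {"tier": "platinum"}, "2020-06-01")
```

History is preserved: reports joining the fact table on the date range still see "gold" for 2019 activity and "platinum" afterward.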

Confidential, Dallas

Senior Consultant

Technologies: Cognos, Spotfire 4, Talend, SSIS, SSRS, OBIEE 11g, Oracle 10g, Informatica

Responsibilities:

  • Part of ETL / BI projects for sales, marketing, customer, and operational dashboards.
  • Create ETL and BI reports related to inventory position, plan performance, customer contribution, order summaries, employee and termination listings, salary and new-hire listings, and the over-55-hour report.
  • Create SSRS reports and SSAS cubes for the operational reporting environment.
  • Create / modify data models in Cognos / OBIEE. Create reports in Report Studio, Query Studio, Analysis Studio, and OBIEE Analytics based on the requirements.

Confidential, Atlanta, GA

ETL / BI Architect / Consultant / Admin

Technologies: Oracle 11g, Informatica, Oracle 10g, 9i, SQL Server, OBIEE 11g, Business Objects 3

Responsibilities:

  • Develop Informatica mappings and workflows per the data model.
  • Optimize external file feeds to work in conjunction with the data models / processes.
  • Design BI reports and dashboards and work with users on data validation, formatting, and sign-off.
  • Analyze issues reported in the operational BI environment and provide appropriate workarounds, long-term designs, or data solutions.
