
Sr. Data Engineer/Analyst Resume


Dearborn, MI

SUMMARY

  • 9+ years of experience with all phases of the Software Development Life Cycle (SDLC), including analysis, resource planning, code development, testing, implementation, and maintenance.
  • Over 9 years of professional experience in database development and in the data analysis and design of OLTP and OLAP systems.
  • Highly effective Data Scientist/Data Analyst with over 6 years of experience in data analysis, data mining, data acquisition, data validation, predictive modeling, data visualization, web scraping, and testing.
  • Proficient in statistical programming languages like R and Python, as well as big data technologies like Hadoop and Hive.
  • Extensively used GCP cloud tools such as BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Functions, etc.
  • Knowledge of both Private and Public Cloud Computing service market, models, offering types, and architecture concepts.
  • Experience with the AWS Redshift Confidential database to support large-scale data storage and analysis; also performed large database migrations.
  • Assisted in the design and management of PostgreSQL database schemas and tables.
  • Experience with NoSQL databases (Apache Cassandra) and NoSQL support in PostgreSQL.
  • Experience in developing Spark applications using Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns (a brief sketch follows this list).
  • Good experience building data pipelines using Delta Lake on Google Cloud Storage and pulling data from Databricks and BigQuery to visualize in Looker.
  • Worked with Qlik Sense data preparation capabilities, including scatter plots with smart data compression, data load editor walk-throughs, and drill-down master item dimensions.
  • Performing daily data analysis, system analysis, business requirement gathering, and data warehousing work.
  • Extensively used Microsoft Office (Excel, PowerPoint, and Word) to validate data, document requirements, and present content to management and the team.
  • Good knowledge of and experience with advanced analytics tools such as Mega HOPEX, QlikView, Qlik Sense, and Tableau.
  • Good knowledge of cloud-based databases, specifically AWS technologies such as RDS, S3, EMR, EC2, and IAM roles and policies.
  • Excellent knowledge of classic data integration technologies, including ETL/ELT, data replication/CDC, message-oriented data movement, and API design and access, as well as emerging data ingestion and integration technologies such as stream data integration, CEP, and data virtualization (HOPEX tool).
  • Good knowledge of Power BI, Python, R, Hadoop, Spark, Kafka, Rally, Java, C++, C#, Linux, and UNIX tools, including object-oriented design.
  • Used Webex Teams spaces to collaborate with the teams and with management.
  • Good experience with root cause analysis: defining the problem, collecting the data, identifying the possible and root causes, evaluating solutions, and solving the problem.
  • Good experience understanding and analyzing structured and unstructured data.
  • Good knowledge of and experience with advanced data warehousing and data mining principles, including the architecture of data marts, fact tables, dimension tables, and data views within data marts/data warehouses, and data models such as star and snowflake schemas.
  • Extensively worked on the Mega HOPEX tool to digitize data for Enterprise Architecture.
  • Good experience reading architecture artifacts (Tech07, CBIDs in Visio and InfoPath) and the MVP Template.
  • Trained team members in modeling technical architecture data (Application Environment, IT Infrastructure, and Application System models) in HOPEX, including Application Environment data modeling.
  • Good experience in architecture and design, including design patterns, reliability, and scaling.
  • Created a number of user guides and published them on the EA portal.
  • Worked in the HOPEX tool to load architecture data using modeling methodology; created hundreds of Application Environment models (Application Interface diagrams) along with their NFRs (Non-Functional Requirements) and ATAs (Application Technical Architecture).
  • Good experience with the IT infrastructure of applications; modeled data in HOPEX that can display an application's IT servers, clients, upstream and downstream partner applications, communication channels, networks, application technical architecture, etc.
  • Created an Application System Structure Diagram, a component of ASE models, while working on the HOPEX Application System Environment (ASE) to visualize the data of Product Group, Product Line, and Product.
  • SPOC for Smart IT Connect tickets raised by architects, product teams, and business customers; managed the tickets, assigned them to team members as needed, and completed/closed them within the standard time.
  • Worked on data cleaning to eliminate bad data (such as retired entries) and added related data to the applications in accordance with customer needs.
  • Worked on HOPEX Advanced Queries (ERQL queries), which aid in correcting data pertaining to partner applications, position types, interactions, exchange contracts, contents, and technologies.
  • Extensively used the APM (Application Portfolio Management) QlikView dashboard to identify changes/anomalies in data.
  • Good experience in data integration, modeling, data visualization, and data cleaning to maintain data quality in the repository, all directly relevant to data engineering.
  • Good experience using MVP Templates, Application Analysis Interface Templates, and Import/Bulk Load Templates to fix application data in the HOPEX Application Environment (AE) model.
  • Familiar with using SharePoint (SP) sites.
  • Extensive experience in data migration techniques using SQL*Loader and Import/Export.
  • Ran a number of Application Interface Integrity reports, offline reports, and capability reports on HOPEX models.
  • Conducted UAT for technology upgrades: used and updated test plans for the Technology Architecture (TA), tested various HOPEX tool versions, and collaborated with other workstreams and development teams to complete a successful UAT.
  • Extensively used the EAMS (Enterprise Architecture Management System) web tool for training architects.
  • Facilitated FSAC (Confidential Software Architects Conference) presentations and recorded the sessions.
  • Took the lead role in Rally (project management tool), running weekly status meetings to track each phase of the development iterations and releases; strong teamwork and collaboration skills, with a track record of leading and working with architecture teams; good experience using TO Compass to analyze the team's work.
  • Defined the technical architecture modeling and architecture content management in Mega HOPEX and the web client, maintaining new metaclasses per the process and defining queries and reports.
  • Supported preparation of the technology roadmap; good knowledge of Big Data and its concepts.
  • Expertise in data modeling principles/methods including conceptual, logical & physical Data Models.
  • Ability to quickly learn and adapt modeling methods from case studies or other proven approaches.
  • Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience.
  • Strong interpersonal skills and the ability to work as part of a team.
  • Skilled in visualizing, manipulating, and analyzing large datasets, with the ability to design and develop effective reports. Proficient with MS Office applications, especially Excel (pivot tables, lookups, INDEX/MATCH, dashboards, etc.).
  • Experience in developing rich interactive Tableau reports and dashboards using various visualizations such as chart boxes (drill down, drill up & cyclic grouping), heat maps, tree maps, bubble charts, reference lines, dual axes, line diagrams, bar graphs, geographic visualizations (detailed and filled maps), etc.
  • Skilled in Power BI and in visualizing results using Tableau to import, analyze, and create various filters and parameters for reports from data sources such as SQL Server.
  • Proficient in data analysis, system analysis, business requirement gathering, and data warehousing concepts.
  • Experience transforming large, complex, unstructured data into denormalized star or snowflake schemas.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
  • Knowledge in Data Engineering, Business Analysis, Advanced Analytics, Data Analysis.
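
A minimal PySpark sketch of the multi-format extraction and aggregation pattern referenced above; the paths, table, and column names are hypothetical placeholders, not actual project code.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical paths and column names, for illustration only.
    spark = SparkSession.builder.appName("usage-aggregation").getOrCreate()

    # Extract: read the same logical feed from two file formats.
    csv_df = spark.read.option("header", True).csv("/mnt/raw/usage_csv/")
    parquet_df = spark.read.parquet("/mnt/raw/usage_parquet/")

    # Transform: align schemas and union the sources.
    events = csv_df.select("customer_id", "event_type", "event_ts").unionByName(
        parquet_df.select("customer_id", "event_type", "event_ts"))

    # Aggregate: daily event counts per customer to surface usage patterns.
    daily_usage = (events.withColumn("event_date", F.to_date("event_ts"))
                   .groupBy("customer_id", "event_date")
                   .agg(F.count("*").alias("event_count")))
    daily_usage.write.mode("overwrite").saveAsTable("analytics.daily_usage")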

TECHNICAL SKILLS

  • Python & R for Data Science
  • HIVE
  • Big Data and Spark for Data Analytics and Data Engineering.
  • Cron and Autosys Scheduling
  • Unix Shell Scripting
  • Oracle SQL
  • PL/SQL TERADATA
  • HDFS
  • NoSQL
  • PostgreSQL and Greenplum
  • ERWIN
  • Jupyter Notebooks
  • Visual Studio Code
  • GIT and Groove Automation
  • Kubernetes
  • TFS
  • Power BI / Tableau / Cognos / SSRS
  • Maven 3.0
  • Google Cloud
  • Azure
  • Docker
  • CI/CD
  • PCF
  • ANT
  • GRADLE

PROFESSIONAL EXPERIENCE

Sr. Data Engineer/Analyst

Confidential - Dearborn, MI

Responsibilities:

  • Translated the business requirements into workable functional and non-functional requirements at a detailed production level using workflow diagrams.
  • Actively working on GCP cloud tools such as BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Functions, etc.
  • Working with Qlik Sense data preparation capabilities, including scatter plots with smart data compression, data load editor walk-throughs, and drill-down master item dimensions.
  • Performing daily data analysis, system analysis, business requirement gathering, and data warehousing work.
  • Working on database development and the data analysis and design of OLTP and OLAP systems.
  • Working on the AWS Redshift Confidential database to support large-scale data storage and analysis; also performed large database migrations.
  • Configured Jenkins to handle application deployment on the cloud (AWS) and to integrate with GitHub version control.
  • Extracting, transforming, and loading data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and Azure Databricks.
  • Collaborating with other team members in the design and management of PostgreSQL database schemas and tables.
  • Created Azure Functions using Java that receive inputs from the service calls and perform the database interactions.
  • Involved in sessions with the business, project manager, Business Analyst, and other key people to understand the business needs and propose a solution from a warehouse standpoint.
  • Collaborating with other teams in designing the architecture of the organization at the enterprise level; thorough understanding and hands-on experience of the software development life cycle from ideation to implementation.
  • Actively working on building data pipelines using Delta Lake on Google Cloud Storage and pulling data from Databricks and BigQuery to visualize in Looker (a brief sketch follows this list).
  • Performing a significant number of tasks to collect, organize, analyze, and disseminate large amounts of information with attention to detail and accuracy.
  • Working on Data Visualization & Reporting tools such as Tableau, Power BI, Qlik Sense, QlikView.
  • Supporting Data Engineering, Business Analysis, Advanced Analytics, Data Analysis.
  • Excel, PowerPoint, and Word documents from Microsoft Office have been utilized extensively to validate data, establish requirements, and present content to management and team members.
  • Collaborated with the teams and with management using the Webex Teams environment.
  • Significant work on the Mega HOPEX tool to digitize data for Enterprise Architecture.
  • Solid reading skills for the MVP Template and the Tech07, CBID, and InfoPath artifacts.
  • Trained team members to model technology architecture data in HOPEX (Application Environment, IT Infrastructure, and Application System models).
  • Instructed, trained, and monitored other team members on Application Environment data modeling.
  • Excellent understanding of emerging data ingestion and integration technologies, including stream data integration, CEP, and data virtualization, as well as traditional integration technologies such as ETL/ELT, data replication/CDC, message-oriented data movement, and API design and access (HOPEX tool).
  • Experience in developing Spark applications using Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
  • Involved in creating CI/CD pipelines and datasets using TFS and groove automation to load data into the data warehouse; documented logical, physical, relational, and dimensional data models and designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Worked in the HOPEX tool to visualize Solution Architect data and created hundreds of Application Environment models along with their NFRs (Non-Functional Requirements) and ATAs (Application Technical Architecture).
  • Extensive experience in data migration techniques using SQL*Loader and Import/Export.
  • Experience with HOPEX IT Infrastructure models, which can display solution architecture data for an application's IT servers, clients, upstream and downstream partner applications, communication channels, networks, application technical architecture, etc.
  • Created an Application System Structure Diagram, a component of ASE models, while working on the HOPEX Application System Environment (ASE) to visualize the data of Product Group, Product Line, and Product.
  • Worked on data cleaning to support IT and Smart IT Connect tickets, eliminating bad data (such as retired entries) and adding related data to the applications in accordance with customer needs.
  • Ran Application Interface Integrity reports, offline reports, and capability reports on HOPEX models.
  • Worked on HOPEX Advanced Queries, which aid in correcting data pertaining to partner applications, position types, interactions, exchange contracts, contents, and technologies.
  • Contributed to developing test plans for the Technology Architecture (TA), tested various HOPEX tool versions, and completed a successful UAT.
  • Good experience in data integration, modeling, data visualization, and data cleaning to maintain data quality in the repository, all directly relevant to data engineering.
  • Created and revised a number of HOPEX modeling user guides that were published on the Enterprise Architecture (EA) portal.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model.
  • Conducted design reviews with Business Analysts, Database Administrators, and Content Developers to validate the models.
  • Performed performance tuning, query optimization, client/server connectivity, and database consistency checks.
  • Managed HOPEX administration and configuration activities such as user profile and access management and repository object management.
  • Proposed new features in the HOPEX tool to enhance its capabilities and make it more user-friendly for modelers, customers, and trainees; those features were approved for build.
  • Created and maintained the Data Model Repository per company standards.
  • Worked with the Application Development team to implement appropriate data strategies.
  • Good experience working with popular data discovery, analytics, and BI software tools like Power BI, QlikView, Tableau, etc.
  • Facilitated and recorded FSAC (Confidential Software Architects Conference) lectures.
  • Played a major role in the Rally board (Agile); good experience in creating and maintaining features, user stories, and tasks with release, iteration, and plan estimation.
  • Played a major role in maintaining, supporting, and monitoring my team's user stories, with velocity and capacity tracked for each iteration.
  • Good experience with TO Compass, which helps a team examine and improve its work patterns within the organization.
  • Completed Rally 101, 102, 103, and 104 training.
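
A minimal sketch of the Delta Lake on Google Cloud Storage to BigQuery flow described above; the bucket, dataset, and table names are hypothetical, and the Delta Lake and spark-bigquery connector jars are assumed to be on the Spark classpath.

    from pyspark.sql import SparkSession

    # Hypothetical bucket and dataset names, for illustration only.
    spark = (SparkSession.builder.appName("delta-to-bigquery")
             .config("spark.sql.extensions",
                     "io.delta.sql.DeltaSparkSessionExtension")
             .config("spark.sql.catalog.spark_catalog",
                     "org.apache.spark.sql.delta.catalog.DeltaCatalog")
             .getOrCreate())

    # Read the curated Delta table from Google Cloud Storage.
    orders = spark.read.format("delta").load("gs://example-lake/curated/orders")

    # Land the result in BigQuery, where Looker can query it directly.
    (orders.write.format("bigquery")
           .option("table", "analytics.orders_curated")
           .option("temporaryGcsBucket", "example-lake-tmp")
           .mode("overwrite")
           .save())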

Environment: C, C++, Rally, APM, Java, Core Java, Mega HOPEX, Provision, SQL, PL/SQL, QlikView, Tableau, ArchiMate, ASP.NET, C#.NET, SQL Server Management Studio, Microsoft Office (PowerPoint, Excel, Word)

Sr. Data Analyst/Engineer

Confidential, Southfield, MI

Responsibilities:

  • Involved in sessions with the business, project manager, Business Analyst, and other key people to understand the business needs and propose a solution from a warehouse standpoint.
  • Experienced in designing the architecture of the organization at the enterprise level, with a thorough understanding and hands-on experience of the software development life cycle from ideation to implementation.
  • Implemented various AWS services such as EC2 instances, SQS, Kinesis, and S3 buckets, and configured security groups for accessibility from on-premises environments.
  • Managed IAM security policies, roles, and key pairs for multiple instance types such as Windows and Linux.
  • Worked on implementing AWS NAT gateways and NAT instances with spread instances.
  • Built CI/CD pipelines for end-to-end data processing using groove automation.
  • Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata per business requirements using ER Studio.
  • Implemented metadata standards, data governance and stewardship, master data management, ETL, ODS, data warehouse, data marts, reporting, dashboard, analytics, segmentation, and predictive modeling.
  • Worked with SMEs and other stakeholders to determine the requirements and identify entities and attributes to build conceptual, logical, and physical data models; designed an industry-standard data model specific to the company's group insurance offerings, and translated the business requirements to a detailed production level using workflow diagrams, sequence diagrams, activity diagrams, and use case modeling.
  • Responsible for analyzing report requirements and developing the reports by writing Teradata SQL queries and using MS Excel, PowerPoint, and UNIX.
  • Designed the database maintenance plan for SQL Server performance, covering database integrity checks, updating database statistics, and re-indexing.
  • Created a high-level, industry-standard, generalized data model to be converted into logical and physical models at later stages of the project using ER-Studio.
  • Developed stored procedures, functions, and packages to implement logic at the server end on Oracle, and performed application/SQL tuning using Explain Plan, SQL tracing, and TKPROF; also used materialized views for the reporting requirements.
  • Developed numerous Teradata SQL queries by creating SET or MULTISET tables, views, and volatile tables, using inner and outer joins, date and string functions, and advanced techniques like the RANK and ROW_NUMBER functions (a brief sketch follows this list).
  • Traced and catalogued data processes, transformation logic, and manual adjustments to identify data governance issues.
  • Involved in extensive data validation using SQL queries and back-end testing; generated DDL statements for the creation of new ER/Studio objects like tables, views, indexes, packages, and stored procedures.
  • Conducted design walk-through sessions with the Business Intelligence team to ensure that reporting requirements were met for the business; developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Collaborated with the ETL, BI, and DBA teams while working on SQL Server and Teradata to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Performed dimensional modeling on the OLAP system using Ralph Kimball methodologies, and extracted data from Oracle 11g and uploaded it to Teradata tables using the Teradata utilities FastLoad and MultiLoad.
  • Generated ad-hoc reports using Crystal Reports 9, prepared analytical and status reports, and updated the project plan as required.
  • Worked closely with the ETL SSIS developers to explain the complex data transformation logic, and created and deployed reports using SSRS.
  • Utilized Power Query in Power BI to pivot and unpivot the data model for data cleansing and data massaging.
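
A small, hypothetical example of the ROW_NUMBER pattern referenced above, run from Python via the teradatasql driver (an assumption; the original work used Teradata SQL Assistant). Table, column, and connection details are illustrative only.

    import teradatasql  # Teradata SQL Driver for Python

    # Hypothetical table and column names; the query picks the latest
    # balance record per account using ROW_NUMBER.
    QUERY = """
    SELECT account_id, txn_dt, balance
    FROM (
        SELECT account_id, txn_dt, balance,
               ROW_NUMBER() OVER (PARTITION BY account_id
                                  ORDER BY txn_dt DESC) AS rn
        FROM fin_db.account_balances
    ) t
    WHERE rn = 1
    """

    with teradatasql.connect(host="tdhost", user="etl_user", password="***") as con:
        with con.cursor() as cur:
            cur.execute(QUERY)
            for row in cur.fetchall():
                print(row)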

Environment: ER-Studio, Python, Teradata, Netezza, Oracle 12c, Cognos, MS Office (Word, Excel, and PowerPoint), SQL Server, MS Project, MS FrontPage, MS Access, EDI, UML, MS Visio, Oracle Designer, Oracle SQL Developer, Crystal Reports, SSRS, SSIS, and Tableau.

Sr. Data Engineer/Analyst

Confidential

Responsibilities:

  • Worked with a team of data scientists, machine learning engineers, software engineers, and QA engineers.
  • Performed data collection, preprocessing, feature engineering, data visualization and analysis.
  • Built automation for data collection and preprocessing, and built models to address business problems.
  • Supported preparation of technology roadmap.
  • Worked on Matillion ETL with the Enterprise IT team to maintain and develop data integration solutions to support our analytics platform.
  • Migrated and transformed data from different sources and provided production support for existing products.
  • Worked extensively with SSIS, SQL Server, stored procedures, data marts, Matillion, AWS, and Snowflake.
  • Engaged with lines of business, users, and analysts to explore and prototype opportunities and use cases exploiting data and the application of cognitive and machine learning technologies.
  • Designed, developed, tested, and supported cognitive microservices to operationalize and productize the deployment of resulting models and cognitive solutions.
  • Worked with business users to understand the business requirements and gathered the full detailed reporting requirements from the business.
  • Implemented logical and physical data modeling techniques using Erwin in the data mart; used a reverse engineering approach to redefine entities, relationships, and attributes in the data model per new specifications in Erwin after analyzing the database systems currently in use; performed dimensional modeling on the OLAP system using Ralph Kimball methodologies and performed gap analysis.
  • Used the Erwin column property editor to create and name physical properties; the Erwin index editor to create, name, sort, and cluster indexes for physical-design performance tuning; the Erwin physical object editor and table property editor to create and name physical objects for the physical design; and the Erwin synchronization dialog to sync physical objects with the target environment.
  • Used the ETL tool Informatica to populate the database and transform data from the old database to the new database using Oracle; involved in the creation and maintenance of the data warehouse and of repositories containing metadata.
  • Designed the procedures for getting the data from all systems into the data warehousing system; the data was standardized to store the various business units in tables.
  • Created data governance and privacy policies specifying contract implementation, and linked data lineage to data quality and business glossary work within the overall data governance program.
  • Used forward engineering to create a physical data model with DDL that best suits the requirements from the logical data model (a brief sketch follows this list); used Erwin for reverse engineering to connect to the existing database and ODS and create a graphical representation in the form of entity relationships to elicit more information; conducted design reviews with Business Analysts, the Enterprise Data Architect, and the solution lead to create a proof of concept for the reports; involved in business process modeling using UML through Rational Rose; maintained metadata and version control of the data model.
  • Created source-to-target mapping documents for the renal and immune disease data mart with all the upstream sources.
  • Wrote SQL scripts for creating tables, sequences, triggers, views, and materialized views; wrote SQL scripts for loading data from the staging area to target tables; heavily involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
  • Transformed complex business logic into database design and maintained it using SQL objects like stored procedures, user-defined functions, views, T-SQL scripting, and jobs.
  • Performed performance analysis; created partitions, indexes, and aggregate tables where necessary; and broke down the hierarchies from the dimensions to create the roll-up dimensions.
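
A toy illustration of the forward-engineering step above (logical model to physical DDL); in practice Erwin generates the DDL, so the entity definitions and type mapping here are purely hypothetical.

    # Hypothetical logical model: entity -> {attribute: logical type}.
    LOGICAL_MODEL = {
        "customer": {"customer_id": "integer", "full_name": "text",
                     "joined_on": "date"},
        "orders": {"order_id": "integer", "customer_id": "integer",
                   "total": "money"},
    }

    # Illustrative logical-to-physical type mapping (Oracle-flavored).
    TYPE_MAP = {"integer": "NUMBER(10)", "text": "VARCHAR2(255)",
                "date": "DATE", "money": "NUMBER(12,2)"}

    def forward_engineer(model: dict) -> str:
        """Emit CREATE TABLE DDL for each entity in the logical model."""
        statements = []
        for entity, attrs in model.items():
            cols = ",\n  ".join(f"{name} {TYPE_MAP[t]}"
                                for name, t in attrs.items())
            statements.append(f"CREATE TABLE {entity} (\n  {cols}\n);")
        return "\n\n".join(statements)

    print(forward_engineer(LOGICAL_MODEL))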

Environment: Informatica, MS Visio, C, C++, Java, Core Java, SQL, PL/SQL, QlikView, Tableau, ASP.NET, C#.NET, Power BI.

Data Engineer

Confidential

Responsibilities:

  • Supervised training and development of software engineering staff and other resources.
  • Analyzed report prototypes from the Business Analysts belonging to different business units; participated in requirements sessions involving the discussion of various reporting needs.
  • Provided support to other developers in accurately mapping source attributes into the Teradata Financial Services Logical Data Model (FSLDM) and in interpreting business requirements; reverse engineered the existing data marts and identified the data elements (in the source systems), dimensions, facts, and measures required for reports.
  • Conducted design discussions and meetings to arrive at the appropriate data warehouse at the lowest level of grain for each of the dimensions involved.
  • Designed a star schema for sales data involving shared (conformed) dimensions for other subject areas using Erwin Data Modeler.
  • Reverse engineered the reports and identified the data elements (in the source systems), dimensions, facts, and measures required for new enhancements of reports.
  • Created ETL jobs and custom transfer components to move data from Oracle source systems to SQL Server using SSIS; designed and created data extracts supporting SSRS, Power BI, Tableau, and other visualization/reporting applications (a rough sketch follows this list).
  • Performed dimensional modeling on the OLAP system and created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Involved in loading data into Teradata from legacy systems and flat files using complex MultiLoad and FastLoad scripts.
  • Assigned work to software engineering personnel and evaluated and managed their performance.
  • Supported recruiting of engineering personnel, technical staffing, and project team formation.
  • Supervised reporting responsibilities for independent software contractors, software engineers and outsourcing partners.
  • Ensured that software engineering personnel were properly trained to perform all job functions and that project and software-related tools were used appropriately.
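
The Oracle-to-SQL Server moves above were done with SSIS; the following is a rough plain-Python analogue of the same extract-and-load step, with hypothetical connection details, table, and column names.

    import oracledb  # python-oracledb driver for the Oracle source
    import pyodbc    # ODBC driver for the SQL Server target

    # Hypothetical connection details and table names.
    src = oracledb.connect(user="etl", password="***", dsn="orahost/orcl")
    dst = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                         "SERVER=sqlhost;DATABASE=dw;UID=etl;PWD=***")

    src_cur = src.cursor()
    dst_cur = dst.cursor()

    # Extract from the Oracle source in batches and load into staging.
    src_cur.execute("SELECT sale_id, store_id, sale_amt, sale_dt FROM sales")
    while rows := src_cur.fetchmany(1000):
        dst_cur.executemany(
            "INSERT INTO stg_sales (sale_id, store_id, sale_amt, sale_dt) "
            "VALUES (?, ?, ?, ?)", rows)
        dst.commit()

    src.close()
    dst.close()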

Environment: Windows XP/2000/NT/98/95, MS-DOS, C, C++, Java, Core Java, SQL, PL/SQL, ASP.NET, C#.NET
