
ETL Lead/Developer Resume


SUMMARY:

  • 12+ years of experience in the Enterprise Data Warehousing and Business Intelligence domain.
  • Demonstrated ability to lead a project team through the complex phases of a business transformation: solution design, implementation, and support.
  • Worked extensively with ETL technologies/tools (DataStage, Informatica, SSIS) as an Architect/Designer/Developer.
  • Experience in big data technologies and related tools such as Hadoop, Snowflake, Hive, Cloudera Impala, and Python.
  • Worked extensively with InfoSphere components such as DataStage, QualityStage, Business Glossary, FastTrack, and Information Analyzer.
  • Experience in data migration, data modeling, data mart building, and data mining/data profiling techniques.
  • Experience in big data and cloud-based technologies such as Hadoop, BIRST, Azure, AWS, and Snowflake.
  • Experience working with cluster and grid architectures such as IBM Information Server and the BIRST cloud-based analytical engine.
  • Experience developing dashboards/reporting in SSRS, the OBIEE suite, and BIRST.
  • Significant experience in Retail, Finance, Banking, and Insurance organizations.
  • Business Analysis, Data and Source Elements Mapping
  • Technical architecture, defining and designing interfaces for Enterprise level data conversions, designing overall ETL process, defining object and data level security.
  • Experience in implementing Business logic using ETL components, Triggers, Views and Stored Procedures
  • Development, testing, performance tuning, and optimization of enterprise-level data conversion ETL, PL/SQL and T-SQL stored procedures/queries, and reporting components.
  • Automation, scheduling, deployment, and monitoring of ETL components and packages.
  • Extensive experience in design and development of conceptual, logical, and physical data models, data marts, Operational Data Stores (ODS), OLAP (tabular cubes), and dimensional data modeling (star schema and snowflake) for fact, dimension, and bridge tables, from inception through implementation and ongoing support, using Oracle PL/SQL, MySQL, Teradata, DB2, Essbase, and Netezza with ERWIN and TOAD data modelers.
  • Fast learner with excellent communication, interpersonal, and collaboration skills, and the ability to manage multiple stakeholders.
  • Consult on technical issues and provide optimal solutions.
  • Assist in estimation of project effort.
  • Project Management - Lead and Manage Teams and Ensure Quality / On-time / In Budget delivery
  • Prepare ETL Development Guidelines, Standards and Estimation Templates and process improvements
  • Strong working experience with the Waterfall SDLC model, and significant experience with Agile and iterative SDLC models.

TECHNICAL SKILLS:

ETL Tool: InfoSphere DataStage, Informatica, QualityStage, Oracle PL/SQL scripts, SSIS, ODI, AWS S3 Buckets, Azure Data Factory, Data Lakes, big data technologies (Hadoop, Cloudera Impala, Hive), Talend, Snowflake

Reporting Tool: BIRST, OBIEE, SSRS, BusinessObjects, Tableau, MicroStrategy

Other Tools: Teradata SQL Assistant, MS Excel Macros, Control-M, AS400/Mainframes, ESP, Tivoli, Autosys

Languages/Script: Advanced SQL/PLSQL, T-SQL, NoSQL, Unix Shell Scripting, Python, C/C++.

Databases: Teradata 14/15/16, Oracle 11g/12c, DB2 8.2/9.7, SQL Server 2012/2014/2016, XML, MS Access, Netezza, AWS S3 Buckets

Testing Tools: HP ALM (Quality Center)

Version Control: GitHub, MS Visual SourceSafe, CA Harvest, CVS

OS: UNIX, Windows

Management Tools: MS Project, Share Point

PROFESSIONAL EXPERIENCE:

Confidential

ETL-Lead/Developer

Responsibilities:

  • Coordinate with Business Team to Prioritize user stories for iterations
  • User story grooming, discovery, and impact analysis of upstream and downstream systems
  • Subject areas handled: 1) SALES (Corporate and Agency), the major revenue generator; 2) CREW details
  • Architecture and development of ETL DataStage jobs; testing, implementation, and production support
  • Data analysis, data profiling, SQL queries, and performance tuning
  • Migration of Teradata tables to the Azure cloud environment
  • Data Movements in Hadoop Cluster and Data querying using HIVE and other tools
  • Data Ingestion using SPARK/KAFKA
  • Fine tuning and optimization of Datastage Jobs and SSIS Procedures/Queries
  • Processed contracts and purchase orders per EDI X12 file standards
  • Tivoli scheduler - Dynamic Trigger events like File, Database and Time-based batch submission
  • Version Handling using GitHub and Deployment using UCD
  • Main Tools/Technologies: DataStage, Hadoop, Hive, Teradata, SQL Server 2014/2016, Azure Data Factory, Data Lake, Spark, Kafka, Unix Shell Scripting, Tivoli, GitHub
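The file-based dynamic trigger pattern in the Tivoli bullet above (a batch fires only when a trigger file lands) can be sketched in plain Python; the directory and trigger-file names here are hypothetical, and a real scheduler such as Tivoli provides this natively:

```python
import os
import tempfile


def run_if_triggered(trigger_dir, trigger_name, job):
    """Run the job only when its trigger file exists, then consume
    (delete) the trigger so the batch fires once per file drop."""
    path = os.path.join(trigger_dir, trigger_name)
    if not os.path.exists(path):
        return False
    job()
    os.remove(path)
    return True


# Usage sketch with a throwaway directory (hypothetical trigger name).
demo_dir = tempfile.mkdtemp()
first = run_if_triggered(demo_dir, "sales.trg", lambda: None)   # no trigger yet
open(os.path.join(demo_dir, "sales.trg"), "w").close()          # trigger lands
second = run_if_triggered(demo_dir, "sales.trg", lambda: None)  # batch fires
```

Consuming the trigger after the run is what makes the event one-shot, mirroring a scheduler's file-event semantics.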

Confidential

ETL-Lead/Developer

Responsibilities:

  • Documented user requirements and translated them into system solutions.
  • Extensively participated in functional and technical meetings for designing the architecture of ETL load process.
  • Handled the Funding, Accounting Services, Loss Mitigation, and Collections modules
  • Developed Mapping Documents indicating the source tables, columns, data types, transformation required, Business rules, Confidential tables, columns and data types.
  • Designed and developed complex Informatica workflows; implemented IBM Information Server on a Linux grid architecture.
  • Fine-tuned and optimized Informatica workflow jobs and SSIS procedures/queries
  • Developed QualityStage components for address cleansing and for standardization of SSNs, customer names, etc.
  • Designed and developed workflows in SSIS and created packages for deployment and implementation.
  • Worked with vendors such as RDN, Western Union, and FICO, and credit bureaus such as Equifax, Experian, and TransUnion.
  • Generated data feeds for the BIRST analytical software, supporting cloud-based services with big data and grid processing across data nodes
  • Created and populated Hive tables using Talend jobs
  • Wrote data processing queries in Snowflake and Cloudera Impala using the Hue editor
  • Moved data from AWS S3 buckets to OFSLL using the DataStage S3 connector for ad hoc reporting requests
  • Processed contracts and purchase orders per EDI X12 file standards
  • Created different Tasks in workflows which included Sessions, Commands, Worklets, Decision Task, E-mail, Event-Wait etc.
  • Error and exception handling; ingested RDBMS tables using Sqoop and stored them in Hive tables for big data processing.
  • Generated Email Notifications through scripts that run in the Post session implementations.
  • Worked with complex SQL queries, Stored Procedures, Functions, and Packages for data validation and reporting.
  • Worked on Performance Tuning of the complex transformations, mappings, sessions and SQL queries to achieve faster data loads to the data warehouse.
  • Prepared Test Cases for testing each of the interfaces which were used by the QA team.
  • Performed unit testing, system integration test as well as user acceptance testing of the developed code
  • Performed Code review to ensure that ETL development was done according to the company's ETL Standard and that ETL best practice were followed.
  • Main Tools/Technologies: Informatica, SSIS, BIRST, AWS, Hadoop, Hive, Cloudera Impala, Talend, Python, SQL Server 2014/2016, Oracle 12c, SQL/PL-SQL, Azure Data Factory, Data Lake, Snowflake, Unix Shell Scripting, Teradata 16, DB2 Mainframes, Control-M, Harvest
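The kind of standardization rule applied to SSNs in the QualityStage bullet above can be illustrated with a minimal Python sketch; the NNN-NN-NNNN normalization below is an assumption for illustration, not the actual QualityStage rule set:

```python
import re


def standardize_ssn(raw):
    """Normalize an SSN-like string to NNN-NN-NNNN; return None when
    the input does not contain exactly nine digits (illustrative rule)."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) != 9:
        return None
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"
```

Stripping non-digits first lets one rule absorb the many input formats (spaces, dashes, none) seen across source systems.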

Confidential, PLANO, TX

ETL Consultant/Lead

Responsibilities:

  • Data Profiling of different source systems for identifying optimal Extraction procedures
  • Worked closely with source system teams to understand the data and complexities involved.
  • Business Rules, Data Elements Mapping, S2T mappings, ETL design
  • Assist in Development activities in Informatica.
  • Assist in Estimation of Technical work effort and Planning of resources.
  • Assist in Data modeling of Logical/Physical Data model Design.
  • Processing of EDI files, especially 214/315 and RAILINC files
  • Managing Team on Technical Front both Onsite and Offshore and resolve technical issues
  • Creating, Configuring and Fine-tuning ETL workflows designed in MS SQL Server Integration Services (SSIS).
  • Project Management of AppDev Team
  • Performance Tuning of ETL Jobs, SQL Procedures/Queries.
  • Purge procedures and Archival Strategy implementation.
  • Main Tools/Technologies: SQL Server 2012, SSIS, Informatica, Teradata 14, Oracle SQL/PL-SQL, ODI, Unix Shell Scripting, Autosys, CVS
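The purge-and-archival strategy above can be sketched as a retention-window split: rows older than the cutoff are archived ahead of the purge. In practice this ran as SQL procedures, so the Python below is only an illustration with hypothetical row shapes:

```python
from datetime import date, timedelta


def split_for_purge(rows, retention_days, today):
    """Partition rows into (keep, archive) by a retention window:
    rows loaded before the cutoff go to the archive set, the rest stay.
    Row shape and the load_date field are assumptions for this sketch."""
    cutoff = today - timedelta(days=retention_days)
    keep, archive = [], []
    for row in rows:
        (archive if row["load_date"] < cutoff else keep).append(row)
    return keep, archive
```

Passing `today` explicitly keeps the cutoff deterministic, which makes the purge logic easy to test against fixed dates.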

Confidential, TX

ETL Consultant/Lead

Responsibilities:

  • Raised JIRA tickets with detailed specifications, working closely with the business in an Agile life-cycle process.
  • Assist in Estimation of Technical work effort and Planning of resources.
  • End-to-end impact analysis from Sources -> ETL -> Reporting
  • Business Rules and Mapping Exercise
  • Development and testing of ETL in Oracle PL/SQL and DataStage, and of reporting components in MicroStrategy.
  • Designed and built data models, stored procedures, functions, packages, data marts, and cubes
  • Strategy for Exception Handling and Technical Reconciliation.
  • Processed UDT and web-service transactions and stored them in relational databases for reference and further processing.
  • Processed XML transactions from different vendors covering orders, current stock, and sales
  • Integrated PeopleSoft applications with OLAP processing and, eventually, Essbase data marts using DataStage components
  • Performance Tuning of ETL Jobs and SQL queries.
  • Error and Exceptional Handling
  • Code reviews to check against inline Standards
  • Managing Team on Technical Front
  • Main Tools/Technologies: Teradata 12/14, Oracle 9i/10g/11g with PL/SQL, PeopleSoft, Essbase, Unix Shell Scripting, and DataStage, used to build the entire ETL and delta extract process.
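Parsing the vendor XML transactions on orders, stock, and sales can be sketched with the standard library; the element and attribute names below are illustrative, not an actual vendor schema:

```python
import xml.etree.ElementTree as ET


def parse_stock_feed(xml_text):
    """Flatten a vendor stock/sales XML transaction into row dicts
    ready for relational storage (hypothetical element names)."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.findall("item"):
        items.append({
            "sku": item.get("sku"),
            "on_hand": int(item.findtext("onHand")),
            "sold": int(item.findtext("sold")),
        })
    return items


# Illustrative feed, not a real vendor document.
feed = """<stock vendor="demo">
  <item sku="A1"><onHand>40</onHand><sold>8</sold></item>
  <item sku="B2"><onHand>0</onHand><sold>12</sold></item>
</stock>"""
parsed = parse_stock_feed(feed)
```

Flattening each transaction into one dict per item is what lets the downstream ETL treat the feed like any other relational source.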

Confidential

DWBI Consultant/Developer

Responsibilities:

  • Code walkthrough of ETL jobs.
  • Analyzing all the datamart elements.
  • Manual Running of ETL Informatica jobs.
  • Functional testing and performance testing before the upgrade.
  • Coordinated with DBAs on the Oracle upgrade.
  • Functional, regression, and performance testing after the upgrade.
  • Workflow testing of SSIS packages, comparing results before and after the upgrade
  • Main Tools/Technologies: Informatica, DataStage, SSIS, Unix Shell Scripting, DB2 Mainframes, Oracle, and OBIEE.
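The before/after comparison behind the upgrade regression testing can be sketched as a keyed diff of result sets; the row shape and key name below are assumptions for illustration:

```python
def compare_runs(before, after, key="id"):
    """Diff two result sets captured before and after an upgrade run:
    report keys missing afterwards, keys newly added, and keys whose
    row values changed; the core of a regression check (sketch only)."""
    b = {row[key]: row for row in before}
    a = {row[key]: row for row in after}
    return {
        "missing": sorted(set(b) - set(a)),
        "added": sorted(set(a) - set(b)),
        "changed": sorted(k for k in set(b) & set(a) if b[k] != a[k]),
    }
```

Keying rows by a stable identifier rather than comparing ordered lists keeps the check insensitive to row-order differences between the two runs.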
