
ETL Developer/Data Engineer Resume


SUMMARY:

  • Around 7 years of IT experience in designing and implementing Data Warehousing and Data Integration solutions, including requirement gathering, data analysis, design, development, testing, and production support.
  • Involved in all phases of the SDLC (Software Development Life Cycle), from analysis and design through development, testing, implementation, and maintenance, with timely delivery against aggressive deadlines.
  • Worked extensively on ETL processes using Informatica Power Center 9.x and 10.2.
  • Extensively used ETL processes to load data from source systems such as Oracle, flat files, and XML files into target systems, applying business logic in transformation mappings to insert and update records during the load.
  • Hands-on experience with S3, EC2, RDS, EMR, Redshift, Glue, and other AWS services.
  • Experience in creating and maintaining AWS user accounts, security groups, VPC, Route 53, RDS, SNS, SES, and AWS storage services such as S3 and EBS.
  • Worked with various file formats, including CSV, JSON, Parquet, ORC, Avro, XML, ASCII, and plain text.
  • Experience in RDBMS platforms such as Oracle 12c/10g/9i, SQL Server, and MySQL.
  • Experience with dimensional modeling using star schema and snowflake models.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Troubleshoot production data issues and failures, manage critical incidents and bridge calls, automate repetitive tasks, perform problem analysis, maintain the knowledge base and documentation, and implement process improvements.
  • Proficient with testing tools such as HP QC/ALM, Rally, QTP/UFT, LoadRunner, SoapUI, and JIRA.
  • Hands-on experience testing both SOAP and REST based web services.
  • Knowledge of Informatica administration in Windows and Linux environments.
  • Developed Informatica mappings and mapplets and tuned them for optimum performance.
  • Experience in Waterfall and Agile methodologies.

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10.2/10.1/9.x (Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Warehouse Designer, and Informatica Server).

Database: Snowflake, DB2, Oracle 12c/11g, MS SQL Server 2016/2014/2012/2008.

Languages: SQL, PL/SQL, C, XML, Python, COBOL

Scheduling Tools: TWS (Tivoli Workload Scheduler), Control-M.

AWS Cloud Tools: EC2, Elastic Load Balancing, Elastic Container Service (Docker containers), S3, Elastic Beanstalk, CloudFront, Elastic File System, RDS, DynamoDB, DMS, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, CloudFormation, IAM, EMR.

Applications: MS Office, MS Project, Toad 10.6/9.2, Oracle SQL Developer.

Processes: Agile Process, REF (Release Entry Framework).

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer/Data Engineer

Environment: Informatica Power Center 10.2, Oracle Developer, Unix, DB2, SharePoint, Jira.

Responsibilities:

  • Analyze the source systems and identify data that needs to be converted to the target.
  • Apply appropriate data conversion rules to load data into the target.
  • Designing, developing, and maintaining new data ingestion processes.
  • Source-to-target mapping to load data into the data warehouse.
  • Design, develop and support Scalable ETL components to aggregate and move data from a variety of structured and unstructured data sources to the data warehouse/Snowflake.
  • Used AWS Redshift, S3, and EC2 to query large amounts of data stored on S3, creating a virtual data lake without a separate ETL process (see the Redshift Spectrum sketch after this list).
  • Supported data pipelines; automated, controlled, and monitored data movements; and deployed control mechanisms for alerts, traceability, data validation, and archival.
  • Responsibilities included source system analysis, data transformation, loading, and validation for data marts, the operational data store, and the data warehouse.
  • Used the Debugger to debug mappings and correct them.
  • Part of the team for debugging and performance tuning of targets, sources, mappings, and sessions.
  • Worked with Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Designer.
  • Performed performance tuning at the source and target levels using indexes, hints, and partitioning in DB2, Oracle, and Informatica.
  • Designed and developed PL/SQL stored procedures to perform calculations related to fact measures (a hedged sketch follows this list).
  • Converted the PL/SQL procedures to Informatica mappings while also creating database-level procedures for optimum mapping performance.
  • Investigated and fixed bugs occurring in the production environment and provided on-call support.
  • Performed Unit testing and maintained test logs and test cases for all the mappings.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Parsed high-level design specifications into simple ETL code following mapping standards.
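
For illustration, a minimal sketch of the Redshift Spectrum pattern behind the virtual data lake bullet above; the schema, table, bucket, and IAM role names are placeholders, not the actual project objects:

```sql
-- Register a Glue catalog database as an external schema
-- (the role ARN and database name are hypothetical).
CREATE EXTERNAL SCHEMA spectrum_lake
FROM DATA CATALOG
DATABASE 'lake_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'
CREATE EXTERNAL DATABASE IF NOT EXISTS;

-- Expose Parquet files on S3 as an external table; nothing is loaded into Redshift.
CREATE EXTERNAL TABLE spectrum_lake.sales_events (
    event_id    BIGINT,
    customer_id BIGINT,
    amount      DECIMAL(12,2),
    event_ts    TIMESTAMP
)
STORED AS PARQUET
LOCATION 's3://example-bucket/sales_events/';

-- Query the S3-resident data directly, as if it were a local table.
SELECT customer_id, SUM(amount) AS total_amount
FROM spectrum_lake.sales_events
WHERE event_ts >= '2020-01-01'
GROUP BY customer_id;
```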
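
Likewise, a hedged PL/SQL sketch of a fact-measure calculation procedure; the table, column, and procedure names are hypothetical:

```sql
-- Hypothetical procedure: rebuild one month of aggregated fact measures.
CREATE OR REPLACE PROCEDURE load_monthly_fact_summary (
    p_month_start IN DATE
) AS
BEGIN
    -- Make the load re-runnable: clear the target month first.
    DELETE FROM fact_sales_monthly
     WHERE month_start = p_month_start;

    -- Aggregate the detail fact rows into monthly measures.
    INSERT INTO fact_sales_monthly (month_start, product_key, total_qty, total_amount)
    SELECT TRUNC(f.sale_date, 'MM'), f.product_key,
           SUM(f.quantity), SUM(f.amount)
      FROM fact_sales f
     WHERE f.sale_date >= p_month_start
       AND f.sale_date <  ADD_MONTHS(p_month_start, 1)
     GROUP BY TRUNC(f.sale_date, 'MM'), f.product_key;

    COMMIT;
END load_monthly_fact_summary;
/
```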

Confidential, Austin, TX

ETL/Informatica Developer/Business Analyst

Environment: Informatica Power Center 10.1, Oracle SQL Developer, SQL Server, VSTS, SharePoint, Jaspersoft.

Responsibilities:

  • Facilitating meetings with business subject matter experts or clients to formulate business requirements and translating client needs into actionable system requirements.
  • Specifying and analyzing business requirements, functional requirements, system interface requirements, data requirements, and non-functional requirements.
  • Working with data modelers and data analysts to ensure requirements are traceable between the source data and target tables/entities in the respective data models.
  • Ensuring that legacy system data is extracted, delivered, and mapped in a manner that meets current and future functional requirements of the solution.
  • Developing high-quality documentation, including business requirements documents, use cases, software requirements specifications, conceptual page flows, and business process models.
  • Working extensively with VSTS (Azure DevOps) to extract reports (weekly, ad-hoc, PI planning, PI achievement), track user stories, and provide project status.
  • Worked with the Data Modeler, Data Architect, and Database Developers to implement data model designs in SQL and Azure PaaS SQL, handling reverse engineering of the existing data flow, transformation logic from SSIS, SSAS and Excel.
  • Designing, developing and maintaining new data ingestion processes.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
  • Used the Debugger to debug mappings and correct them.
  • Part of the team for debugging and performance tuning of targets, sources, mappings, and sessions.
  • Conducted code walkthroughs with team members.
  • Worked with business analysts and end users to correlate business logic and specifications for ETL.
  • Actively participated in gathering the requirement documents, analyzing, designing and development.
  • Created mapping documents to outline data flow from sources to targets.
  • Extensively worked with Informatica Power Center.
  • Tuned SQL queries for better performance (see the tuning sketch after this list).
  • Eager to learn new skills and technologies and to adapt to new information demands.
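
As an illustration of the query tuning mentioned above, a small before/after sketch on a hypothetical orders table, replacing a per-row correlated subquery with a single aggregated join:

```sql
-- Before: the subquery runs once per order row.
SELECT o.order_id, o.amount
FROM orders o
WHERE o.amount > (SELECT AVG(o2.amount)
                  FROM orders o2
                  WHERE o2.customer_id = o.customer_id);

-- After: compute per-customer averages once, then join.
SELECT o.order_id, o.amount
FROM orders o
JOIN (SELECT customer_id, AVG(amount) AS avg_amount
      FROM orders
      GROUP BY customer_id) c
  ON c.customer_id = o.customer_id
WHERE o.amount > c.avg_amount;
```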

Confidential, Charlotte, NC

ETL/Informatica Developer

Environment: Informatica Power Center 10.1, Informatica Power Exchange, Informatica Data Quality, Oracle 12c, flat files, Toad, Business Objects, Gnosis, PuTTY, Unix, SOAP/REST web services, HP QC/ALM.

Responsibilities:

  • Involved in creating technical design documents, source to target mapping documents and test case documents to reflect ETL process.
  • Gathered requirements and created functional and technical design documents covering the business rules, data mappings, and workflows.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modelling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Coordinate weekly QA and production release activities.
  • Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
  • Developed mapplets, reusable transformations, source and target definitions, and mappings using Informatica.
  • Used heterogeneous sources such as Oracle and flat files, and imported stored procedures from Oracle for transformations.
  • Used dimensional modelling techniques to create dimensions, cubes, and fact tables (a star schema sketch follows this list).
  • Written PL/SQL procedures for processing business logic in the database. Tuned SQL queries for better performance.
  • Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Server Manager.
  • Generated completion messages and status reports using Informatica Server Manager.
  • Tuned ETL procedures and STAR schemas to optimize load and query Performance.
  • Started sessions and batches and set up event-based scheduling.
  • Managed migration in a multi-vendor supported Server and Database environments.
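
A minimal star schema sketch matching the dimensional modelling bullets above; the tables and columns are illustrative placeholders, not the project's actual model:

```sql
-- Two dimensions plus one fact table keyed by surrogate keys.
CREATE TABLE dim_customer (
    customer_key  NUMBER PRIMARY KEY,   -- surrogate key
    customer_id   VARCHAR2(20),         -- natural key from the source system
    customer_name VARCHAR2(100),
    region        VARCHAR2(50)
);

CREATE TABLE dim_date (
    date_key  NUMBER PRIMARY KEY,       -- e.g., 20200131
    cal_date  DATE,
    cal_month NUMBER,
    cal_year  NUMBER
);

CREATE TABLE fact_sales (
    customer_key NUMBER REFERENCES dim_customer (customer_key),
    date_key     NUMBER REFERENCES dim_date (date_key),
    quantity     NUMBER,
    amount       NUMBER(12,2)
);
```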

Confidential, Los Angeles, CA

ETL/Informatica Developer

Environment: Informatica Power Center 9.6, Oracle 10g/9i, XML, flat files, SQL, PL/SQL, Control-M, Windows, UNIX, Business Intelligence, JIRA.

Responsibilities:

  • Extensively used Informatica Power Center and Power Exchange for extracting, transforming, and loading data from relational and non-relational sources.
  • Extensively used transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner while migrating data from heterogeneous sources like Oracle, DB2, XML, and flat files to Oracle.
  • Developed mappings using the Designer to extract and transform data according to requirements and load it into the database.
  • Handled Type 2 slowly changing dimensions to populate current and historical data into dimension and fact tables in the data warehouse (a hedged sketch follows this list).
  • Created JIRA projects, templates, workflows, screens, and fields, and performed other administrative activities.
  • Used transformations such as Sorter, Aggregator, Lookup, Expression, Filter, Update Strategy, Joiner, and Router in mappings, per the transformation logic.
  • Designed basic UNIX scripts and automated them to run the workflows daily, weekly, and monthly.
  • Scheduled sessions and batches on the Informatica Server using Informatica Workflow Manager.
  • Migrated Informatica objects and database objects to the integration environment and scheduled them using Control-M.
  • Monitored the ETL jobs and fixed bugs.
  • Involved in unit testing, integration testing, and system testing.
  • Identified and created various classes and objects for report development.
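
A hedged sketch of the Type 2 slowly changing dimension pattern referenced above, written as plain Oracle SQL over hypothetical staging and dimension tables (an Informatica Update Strategy mapping would implement the same logic):

```sql
-- Step 1: expire the current dimension row when tracked attributes changed.
UPDATE dim_customer d
   SET d.effective_end = SYSDATE,
       d.is_current    = 'N'
 WHERE d.is_current = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.region <> d.region));

-- Step 2: insert a new current version for new or changed customers.
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name, region,
        effective_start, effective_end, is_current)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.region,
       SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.is_current = 'Y');
```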

Confidential

ETL/Informatica Developer

Environment: Informatica Power Center 9.x, Oracle 9i, MS SQL Server 2000, SQL, PL/SQL, SQL*Loader, UNIX Shell Script.

Responsibilities:

  • Extracted data from different sources using Informatica.
  • Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Extracted data from different source databases; created a staging area to cleanse and validate the data (a cleansing sketch follows this list).
  • Designed and developed complex Aggregator, Expression, Filter, Joiner, Router, Lookup, and Update Strategy transformation rules.
  • Developed schedules to automate the update processes and Informatica sessions and batches.
  • Analyzed, designed, constructed, and implemented the ETL jobs using Informatica.
  • Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica Power Center 6.1.
  • Developed Shell scripts to setup runtime environment, and to run stored procedures, packages to populate the data in staging tables.
  • Created users, user groups, and database connections and managed user privileges using Supervisor.
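
A minimal sketch of the staging cleanse-and-validate step mentioned above; the staging table, columns, and validation rules are assumptions for illustration:

```sql
-- Flag rows that fail basic validation rules.
UPDATE stg_orders
   SET load_status = 'REJECT'
 WHERE order_id IS NULL
    OR order_date IS NULL
    OR amount < 0;

-- Deduplicate: keep one row per order_id among the remaining rows.
DELETE FROM stg_orders s
 WHERE s.load_status IS NULL
   AND s.ROWID <> (SELECT MAX(s2.ROWID)
                     FROM stg_orders s2
                    WHERE s2.order_id = s.order_id
                      AND s2.load_status IS NULL);

-- Only clean rows are passed on to the warehouse load.
SELECT COUNT(*) AS clean_rows
  FROM stg_orders
 WHERE load_status IS NULL;
```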
