
Lead/Manage ETL Informatica/Snowflake Developers Resume


FL

SUMMARY

  • Over 12 years of IT experience with a strong emphasis on the design, development, testing, and implementation of business application systems, delivering simplified ETL solutions for a range of business requirements. Data warehousing and cloud data migration experience using Snowflake, Informatica PowerCenter 10.x/9.x/8.x, PowerExchange, and Informatica IICS.
  • Managing and leading a team of data engineers to design, build, and maintain the organization's data architecture and infrastructure solutions.
  • Collaborating with cross-functional teams, such as data analysts, data scientists, and business stakeholders, to understand their data needs and develop solutions that support their goals.
  • Defining and managing the data engineering project roadmap, ensuring that projects are delivered on time and within budget.
  • Ensuring the reliability, scalability, and security of data infrastructure and systems, and implementing disaster recovery and business continuity plans.
  • Managing data integration processes, including ETL pipelines and data migration, to ensure the smooth flow of data between systems.
  • Staying up to date with new technologies and industry trends, and identifying opportunities to improve data engineering processes and infrastructure.
  • Hiring and training data engineers, and fostering a culture of innovation, collaboration, and continuous learning within the data engineering team.
  • Extensive experience in the financial, banking, communications, and insurance domains.
  • Experienced in the installation, configuration, and administration of Informatica PowerCenter client/server and Informatica IICS.
  • Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and denormalization concepts.
  • Designed and implemented data integration between heterogeneous source systems.
  • Implemented IICS-based data synchronization, data replication, and mapping configuration tasks.
  • Experience using Python programming for data transformation activities.
  • Experience using AWS services such as EC2, S3, DMS, Lambda, CloudFormation, and DynamoDB.
  • Successfully implemented data migration from on-premises systems to Snowflake DB in the cloud from various sources.
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Experience in support and knowledge transfer to the production team.
  • Demonstrated ability to work and communicate effectively with both business and technical audiences.
  • An excellent team member with the ability to perform individually, good interpersonal relations, strong communication skills, a strong work ethic, and a high level of motivation. A quick learner with an aptitude for taking on responsibilities.
  • Have also worked in an onsite-offshore model, carrying out the different phases of projects effectively through excellent coordination and communication.
  • Experience leading/mentoring teams of junior developers on a variety of projects by providing detailed ETL solution designs and helping with questions throughout the development process.
  • Developed effective working relationships with client teams to understand support requirements, developed tactical and strategic plans to implement technology solutions, and effectively managed client expectations.

TECHNICAL SKILLS

Databases\Data Warehouses: Snowflake, Oracle 12c/11g/10g, SQL Server 2016, AS/400 DB2.

Oracle Utilities: TOAD, SQL*PLUS, SQL Developer, SQL*Loader.

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x, Informatica PowerExchange 9.x.

Data Modeling: Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization, Denormalization

Programming Skills: Batch, Shell Scripting (K-Shell, C-Shell), PL/SQL, Python 3.6, PySpark

Operating Systems: UNIX, Windows XP/2000

Cloud Applications: AWS, Salesforce, Snowflake

PROFESSIONAL EXPERIENCE

Confidential, FL

Lead/Manage ETL Informatica/Snowflake Developers

Responsibilities:

  • Lead the data engineering team; responsible for data delivery from data sourcing and data transformation through data modeling and validation.
  • Managing, leading, and implementing data warehouse applications based on specifications, design models, and system workflows.
  • Supporting, troubleshooting and maintaining production systems as required.
  • Optimizing performance, resolving and providing timely follow-up on identified issues.
  • Defining and maintaining BA Management Information technology development standards and best practices.
  • Involved in translating functional requirements into technical mapping specifications, and leveraged Informatica PowerCenter 10.x/9.x/8.x and Informatica IICS to extract and load data from SQL Server, Oracle tables, Salesforce, Azure, AWS S3, and flat files.
  • Created data pipelines using Python, PySpark, and EMR services on AWS.
  • Created AWS CLI commands to transfer files to AWS S3 buckets.
  • Designed Snowpipe for continuous data loading, streams, and task scheduling on Snowflake.
  • Bulk loaded and unloaded data into Snowflake tables using the COPY command.
  • Created data warehouses, databases, schemas, and tables, and wrote SQL queries against Snowflake.
  • Validated data feeds from source systems to the Snowflake DW cloud platform.
  • Integrated and automated data workloads into the Snowflake warehouse.
  • Ensured ETL/ELT jobs succeeded and loaded data successfully into the Snowflake DB.
  • Experience handling batch scripts for file manipulation, extraction, and execution of workflows.
  • Extensively worked on optimization/performance tuning to increase the throughput of the data load.
  • Involved in unit testing and system testing to verify that data loaded into the targets, extracted from the different source systems, was accurate per the user requirements.
  • Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.
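The source-to-Snowflake feed validation described above can be illustrated with a minimal reconciliation sketch. This is not the actual project code; the record fields (`id`, and the row contents) are hypothetical, and it stands in for the SQL-based row-count and content checks a real pipeline would run against the warehouse:

```python
def validate_feed(source_rows, target_rows, key="id"):
    """Reconcile a source extract against rows loaded into the target:
    compare row counts and flag keys that are missing, unexpected, or
    mismatched. Field names are illustrative assumptions."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(set(src) - set(tgt))       # source rows absent from target
    extra = sorted(set(tgt) - set(src))         # target rows absent from source
    mismatched = sorted(k for k in src.keys() & tgt.keys()
                        if src[k] != tgt[k])    # same key, different values
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing": missing,
        "extra": extra,
        "mismatched": mismatched,
        "passed": not (missing or extra or mismatched),
    }
```

A check like this is typically run after every load so that an ETL/ELT failure surfaces as a failed reconciliation rather than silently bad data.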

Environment: Snowflake, Informatica PowerCenter 10.x/9.x/8.x, IICS, Oracle 12c, PL/SQL, Flat Files, SAP HANA, BODS, Windows, Toad, SQL Developer, Python 3.6, PySpark, Lambda, Batch Scripting, Tidal.

Confidential, Dallas TX

Sr ETL/Informatica Developer

Responsibilities:

  • Worked with business analysts; responsible for gathering requirements and IT review. Interacted with business users in the design of technical specification documents.
  • Involved in creating logical and physical data models using MS Visio based on business requirements.
  • Created and monitored Database maintenance plans for checking database integrity, data optimization, rebuilding indexes and updating statistics.
  • Involved in transferring data from the OLTP systems that formed the extraction sources.
  • Successfully loaded data into different targets from various source systems, such as Oracle databases, flat files, ODS, and SQL Server, into the staging table and then into the target Oracle database.
  • Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
  • Re-designed ETL mappings to improve data quality.
  • Used Korn-Shell Scripting to automate the loading process.
  • Created Stored procedure transformations to populate targets based on business requirements.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
  • Used Pipeline Partitioning feature in the sessions to reduce the load time.
  • Analyzed and Created Facts and Dimension Tables.
  • Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
  • Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.
  • Used mapplets and reusable transformations to prevent redundant transformation usage and to promote modularity.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Delivered projects within the specified time while ensuring all requirements gathering, business analysis, and design of the data marts were completed.
  • Involved in Unit and Integration testing of Mappings and sessions.
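The Type II slowly changing dimension handling noted above was implemented in Informatica mappings; as a rough illustration of the underlying logic only, here is a minimal Python sketch. The field names (`customer_id`, `address`) and the single tracked attribute are assumptions for the example:

```python
from datetime import date

def apply_scd_type2(dimension_rows, incoming, today=None):
    """Type II SCD sketch: expire the current row and insert a new
    version when a tracked attribute changes. Rows are dicts keyed by
    a hypothetical natural key 'customer_id'."""
    today = today or date.today()
    current = {r["customer_id"]: r for r in dimension_rows if r["is_current"]}
    out = list(dimension_rows)
    for rec in incoming:
        existing = current.get(rec["customer_id"])
        if existing is None:
            # brand-new member: insert as the current version
            out.append({**rec, "valid_from": today, "valid_to": None,
                        "is_current": True})
        elif existing["address"] != rec["address"]:
            # tracked attribute changed: expire old row, insert new version
            existing["valid_to"] = today
            existing["is_current"] = False
            out.append({**rec, "valid_from": today, "valid_to": None,
                        "is_current": True})
        # unchanged records are left as-is
    return out
```

Type I (overwrite) and Type III (previous-value column) variants differ only in whether history rows are kept versus updated in place.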

Environment: Informatica PowerCenter 8.6.1, Oracle 11i, MS SQL Server 2008, MS Visio, Toad, PowerExchange 8.1, Business Objects XI R3, ERWIN 4.2, Control-M.

Confidential, Dallas, TX

ETL Developer

Responsibilities:

  • Member of core ETL team involved in gathering requirements, performing source system analysis and development of ETL jobs to move data from the source to the target DW.
  • Analyzed the business requirement document and created functional requirement document mapping for all the business requirements.
  • Designed and developed ETL mappings using transformation logic for extracting data from various source systems.
  • Developed complex ETL mappings making use of transformations such as Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Java, Java Expression, Router, Filter, Aggregator, and Sequence Generator.
  • Automated the load process using UNIX shell scripts
  • Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
  • Created reusable objects in Informatica for easy maintainability and reusability.
  • Performed the data validations and control checks to ensure the data integrity and consistency.
  • Extensively used debugger to trace errors in the mapping.
  • Extensively involved in coding business rules through PL/SQL using Functions, Packages, Cursors, and Stored Procedures.
  • Involved in developing test plans and test scripts to test the data based on the business requirements.
  • Created source, target, transformations, sessions, batches and defined schedules for the sessions.
  • Re-designed ETL mappings to improve data quality.
  • Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Connected/Unconnected Lookup, and Filter.
  • Created Workflows and used various tasks like Email, Timer, Scheduler, Control, Decision, and Session in the workflow manager.
  • Modified shell/Perl scripts as per the business requirements.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Partitioned sessions for concurrent loading of data into the target tables.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
  • Used Shell Scripting to automate the loading process.
  • Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
  • Coordinated between different teams across the circle and the organization to resolve release-related issues.
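The shell-based load automation described above can be illustrated with a minimal sketch, written here in Python for portability. The inbox/archive layout and the pluggable `load_fn` hook (standing in for a pmcmd-style workflow trigger) are assumptions for the example:

```python
from pathlib import Path

def run_loads(inbox, archive, load_fn=None):
    """Load-automation sketch: pick up flat files from an inbox
    directory, invoke the load for each, and archive processed files.
    In a real deployment load_fn would trigger the ETL workflow; here
    it is a pluggable callable so the flow runs standalone."""
    inbox, archive = Path(inbox), Path(archive)
    archive.mkdir(parents=True, exist_ok=True)
    processed = []
    for f in sorted(inbox.glob("*.dat")):   # only the expected feed files
        if load_fn:
            load_fn(f)                      # e.g. start the workflow for this file
        f.rename(archive / f.name)          # archive only after a successful load
        processed.append(f.name)
    return processed
```

Archiving only after the (simulated) load succeeds means a failed run leaves its files in the inbox, so the next scheduled run retries them automatically.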

Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g, DB2, SQL*Plus, Flat Files, Windows, IBM UNIX (AIX), TOAD.

Confidential, Manchester, CT

Programmer Analyst

Responsibilities:

  • Create ETL components for data conversion using Informatica 8.6.0.
  • Analyze source and target Schemas.
  • Create mappings and flow charts using Informatica 8.x development, code review.
  • Develop SQL statements and programs on source and target databases.
  • Understand data conversion process.
  • Create root cause analysis and resolution to fix defects

Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g
