
Senior ETL Developer Resume


Berlin, CT

SUMMARY

  • Senior SQL and ETL (Informatica/SSIS) developer with around 9 years of IT experience in requirements gathering, design, development, testing, and support, with extensive work in Business Intelligence / Data Warehousing technologies.
  • Extensive experience providing IT services in the Insurance, Utilities, and Banking domains.
  • Extensive experience in developing ETL applications and performing statistical analysis of data on Oracle, DB2, MySQL, and SQL Server databases.
  • Superior SQL skills: able to write and interpret complex SQL statements, and skilled in SQL optimization, ETL debugging, and performance tuning.
  • Strong experience in data analysis, data migration, data cleansing, transformation, integration, import, and export using multiple ETL tools such as Informatica PowerCenter and SSIS.
  • Experience in developing online transaction processing (OLTP), operational data store (ODS), and data warehouse databases.
  • Extensive knowledge of SQL (DDL and DML) queries to improve database performance and availability.
  • Good skills in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.
  • Adept at writing Data Mapping Documents and Data Transformation Rules and maintaining a Data Dictionary alongside interface requirements documents.
  • Created, documented, and maintained logical and physical database models in compliance with enterprise standards, and maintained metadata definitions for enterprise data stores within a metadata repository.
  • Strong experience using Excel and MS Access to load and analyze data based on business needs.
  • Experience working with dimension tables, fact tables, slowly changing dimensions, data marts, and dimensional modeling schemas.
  • Experience in data modeling, including dimensional modeling and ER modeling, and in OLTP and OLAP data analysis. Very familiar with SCD Type 1 and Type 2 in star and snowflake schemas (see the SQL sketch after this list).
  • Experience in extracting, transforming, and loading (ETL) data from various data sources into data marts and data warehouses using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and the Informatica Administration Console).
  • Strong experience in developing sessions/tasks, worklets, and workflows using the Workflow Manager tools: Task Developer and Workflow & Worklet Designer.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance on large-volume projects.
  • Very good experience using SSIS tools such as the Import and Export Wizard and the SSIS Package Designer.
  • Experience in debugging mappings: identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good experience writing UNIX shell scripts and SQL scripts for development, automation of ETL processes, error handling, and auditing.
  • Exposure to Microsoft Azure cloud databases and high-availability configurations.
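
The SCD handling noted above is normally built as Informatica/SSIS mapping logic; as a plain-SQL illustration only, a minimal Oracle-style sketch of a Type 2 load follows, where the DIM_CUSTOMER and STG_CUSTOMER tables, their columns, and the sequence are all hypothetical:

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
    SET    d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
    WHERE  d.current_flag = 'Y'
    AND EXISTS (SELECT 1
                FROM   stg_customer s
                WHERE  s.customer_id = d.customer_id
                AND   (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a fresh current row for new and changed customers.
    INSERT INTO dim_customer
           (customer_key, customer_id, address, segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           SYSDATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');

A Type 1 load would instead overwrite the changed attributes in place, keeping no history.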

TECHNICAL SKILLS

ETL: Informatica PowerCenter 9.6/8.6; SSIS

Data Warehouse Concepts: ODS, Star Schema, Snowflake schema, OLTP, OLAP

Scheduling Tools: Autosys, Control-M

RDBMS: Oracle 11g/12c, DB2, SQL Server 2008/2012, MySQL

Data Modeling: ERwin 7.5, Entity Relationship and Dimensional (Star, Snowflake Schema)

UNIX: Shell scripting

Reporting Tools: Tableau 9, Cognos BI 8.x/9.x

Defect Tracking Tools: HP Quality Center, Bug Tracker

Operating Systems: Windows XP/2000/9x/NT, UNIX

Cloud Computing: Amazon Web Services (AWS), S3, Redshift

Other Tools: Notepad++, Toad, SQL Navigator, SQL*Plus, Microsoft Excel, MS Visio, Microsoft Visual Studio 2015, PowerPoint

PROFESSIONAL EXPERIENCE

Confidential, Berlin, CT

Senior ETL Developer

Responsibilities:

  • Translated business requirements into SQL queries and implemented performance optimizations on the advertiser side.
  • Created Data Mapping Documents capturing the business rules applied from source data to target data.
  • Responsible for creating the design document and the design of the data flow in the project.
  • Extracted, transformed, and loaded (ETL) data from various data sources into data marts and the data warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and the Informatica Administration Console).
  • Developed sessions/tasks, worklets, and workflows using the Workflow Manager tools: Task Developer and Workflow & Worklet Designer.
  • Generated ad hoc SQL*Plus queries using joins, database connections, and transformation rules to fetch data from Oracle and SQL Server databases.
  • Served as data expert and SME in multiple areas, including the Consolidated Data Repository, Customer Data Warehouse, IVR data repository, and Metering Database.
  • Reviewed normalized/denormalized schemas for effective, optimally tuned queries and data validations in OLTP environments.
  • Developed mappings, sessions, and workflows according to the data flow and loaded the data into the Oracle database.
  • Solely responsible for migrating the ETL code across environments: Dev, Test, and Prod.
  • Created the DDL scripts and implemented them in the different environments.
  • Worked on Informatica PowerCenter using different transformations: Filter, Router, Joiner, Update Strategy (static and dynamic), and Aggregator.
  • Created parameter files and passed connection strings and parameter values through them.
  • Wrote UNIX shell scripts to transfer files to different server locations.
  • Validated and tested the mappings using the Informatica Debugger, session logs, and workflow logs.
  • Developed user-defined functions and created materialized views for high availability of data (see the first sketch after this list).
  • Developed mappings to load fact and dimension tables, covering Type 1 and Type 2 dimensions and incremental loading (see the second sketch after this list), and unit tested the mappings.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using the Informatica Workflow Monitor.
  • Used ETL (SSIS) to develop jobs for extracting, cleansing, transforming, and loading data into an Azure cloud database.
  • Designed SSIS packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, and Conditional Split, along with the Execute SQL, Script, and Send Mail tasks.
  • Created SSIS packages with control flows and data flows.
  • Connected SSIS to the Autosys scheduling tool to schedule and run the packages.
  • Migrated data from the on-premises Oracle database to the Azure cloud database using SSIS packages.
  • The Azure cloud database was configured for high availability (HA) across the East and Central regions with a 95.5% SLA; any downtime or performance issues were worked through with Microsoft and fixed immediately.
  • Parameterized the SSIS connections to the Central and East cloud database regions.
  • Supported the project in its different phases: Dev, SIT, and UAT.
  • Developed and modified JIL files for Informatica workflows and SSIS packages and scheduled them with the Autosys scheduling tool.
  • Monitored the incremental data loads and the scheduled Autosys jobs across the different phases of the project.
  • Supported the project during the warranty phase and fixed critical issues promptly.
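
As a plain-SQL illustration of the materialized views mentioned above, a minimal Oracle-style sketch follows; the METER and METER_READING tables and their columns are hypothetical:

    -- Pre-aggregated daily usage, refreshed on demand so reports avoid the
    -- join/aggregation at query time (fast refresh would additionally
    -- require materialized view logs on the base tables).
    CREATE MATERIALIZED VIEW mv_daily_meter_usage
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT m.meter_id,
           TRUNC(r.read_dt) AS read_day,
           SUM(r.kwh_used)  AS total_kwh,
           COUNT(*)         AS read_count
    FROM   meter m
    JOIN   meter_reading r ON r.meter_id = m.meter_id
    GROUP  BY m.meter_id, TRUNC(r.read_dt);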
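
The incremental loads above typically filter the source on a last-successful-run timestamp supplied through a parameter file; a minimal sketch of such a source-qualifier query follows, where the ORDERS table, its columns, and the $$LAST_RUN_TS mapping parameter are all hypothetical:

    -- Incremental extract: pull only rows changed since the last successful run.
    -- $$LAST_RUN_TS is resolved from the Informatica parameter file at run time.
    SELECT o.order_id,
           o.customer_id,
           o.order_amount,
           o.last_update_ts
    FROM   orders o
    WHERE  o.last_update_ts > TO_TIMESTAMP('$$LAST_RUN_TS', 'YYYY-MM-DD HH24:MI:SS');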

Environment & Tools: SSIS, SQL*Plus, Informatica PowerCenter 9.6, Oracle, UNIX Shell Scripting, SQL Server, Microsoft Azure Cloud SQL Server Database, Autosys Scheduling Tool, FileZilla, PuTTY, Microsoft Import & Export Wizard, AQT, SQL Developer, HP Quality Center.

Confidential

ETL Informatica Developer

Responsibilities:

  • Reviewed SQL queries, created Data Mapping Documents, and worked with the Data Modeling team on analysis and documentation.
  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
  • Coordinated and developed all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using the star schema methodology.
  • Used Repository Manager to create the repository and user groups and managed users by setting up privileges and profiles.
  • Used the Debugger to debug mappings and correct them.
  • Performed database tasks such as creating database objects (tables, views, procedures, and functions).
  • Responsible for debugging and performance tuning of targets, sources, mappings, and sessions.
  • Optimized the mappings and implemented complex business rules by creating reusable transformations and mapplets.
  • Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Joiner, and Update Strategy transformations.
  • Worked on mappings to load fact and dimension tables for SCD Type 1 and Type 2 with incremental loading, and unit tested the mappings.
  • Worked with real-time web services that perform a lookup operation using a key column as input and return multiple rows of data belonging to that key.
  • Created workflows, tasks, database connections, and FTP connections using Workflow Manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations, and fixing the bugs.
  • Created UNIX shell scripts to automate ETL processes; used UNIX for check-ins and check-outs of workflows.
  • Automated ETL workflows using the Control-M scheduler; used Informatica Workflow Manager to create, run, and schedule batches and sessions at specified times.
  • Executed sessions and sequential and concurrent batches for proper execution of mappings, and set up email notifications after execution.
  • Prepared functional specifications, system architecture/design, implementation strategy, test plans, and test cases.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improved ETL performance through indexing and caching.
  • Created staging, dimension, and fact tables using different transformations and loaded the data into the Oracle database.
  • Responsible for creating reject tables, populated whenever a record violates a fact-table rule.
  • Involved in production deployment and subsequent warranty support until transition to the production support team.
  • Created Informatica mappings that read data from HDFS and populated a PostgreSQL database.
  • Wrote SparkSQL to read Parquet files for data quality checks and validation against HDFS (see the sketch after this list).
  • Used Hive/Impala for multiple purposes, such as ad hoc reporting, data validation, and building data quality reports.
  • Exposure to PowerExchange for Hadoop for extracting data from HDFS and loading data to HDFS/Hive.
  • Exposure to Spark transformations and actions (map, mapPartitions, groupBy, groupByKey, aggregateByKey, reduceByKey); worked on moving files to the Hadoop HDFS location and analyzing the data using Spark transformations and actions.
  • Experience in monitoring and reporting issues for the daily, weekly, and monthly processes; resolved issues on a priority basis and reported them to management.
  • Enhanced reports using Cognos Report Studio based on changes made in the source.
  • Worked on metadata modeling using Framework Manager and published packages.
  • Tested the different functionalities of reports, including formatting, prompts, conditional variables, scheduling, and saving reports to the file system.
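
As an illustration of the SparkSQL data quality checks mentioned above, a minimal sketch runnable from the spark-sql shell follows; the HDFS path, view name, and column names are hypothetical:

    -- Expose the Parquet extract as a view, then profile it: row counts,
    -- null business keys, and duplicate keys.
    CREATE TEMPORARY VIEW claims_extract
    USING parquet
    OPTIONS (path 'hdfs:///data/landing/claims/');

    SELECT COUNT(*)                                          AS total_rows,
           SUM(CASE WHEN claim_id IS NULL THEN 1 ELSE 0 END) AS null_claim_ids,
           COUNT(*) - COUNT(DISTINCT claim_id)               AS duplicate_claim_ids
    FROM   claims_extract;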

Environment: Erwin Data Modeler, Hadoop, Hive, SparkSQL, Spark Transformations, Informatica PowerCenter 9.6, Control-M, Oracle, MS SQL Server, JIRA, PuTTY, HPQC

Confidential

Informatica Developer/Data Analyst

Responsibilities:

  • Involved in analyzing business requirements, designing and developing high-level and low-level designs, and performing unit and integration testing.
  • Worked on gathering requirements, developing, testing, and providing support during various conversions.
  • Interacted with clients and users to understand their requirements and provided solutions to meet them.
  • Analyzed the system thoroughly and created a system document for a complex system without any input documentation, which helped us win the project from competitors.
  • Participated in client discussions to gather scope information and analyzed it to provide inputs for project scoping documents.
  • Used UNIX scripting to format files before they entered Informatica.
  • Based on business needs, prepared conceptual data models for long-term solutions and created logical and physical data models using best practices to ensure high data quality.
  • Optimized and updated logical and physical data models to support new and existing design requirements.
  • Used Data Vault as both a data loading technique and a methodology that accommodates historical data, auditing, and tracking of data (see the sketch after this list).
  • Worked on the functional and practical implementation of data governance and was responsible for designing common data governance frameworks.
  • Provided inputs for the overall implementation plan, led deployment of applications/infrastructure, and supported post-production activities.
  • Good domain knowledge in Insurance and claims.
  • Developed complex modules and delivered defect-free, highly optimized deliverables.
  • Trained and groomed new resources on domain and technical knowledge as well as all processes.
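
As an illustration of the Data Vault approach mentioned above, a minimal Oracle-style sketch follows: a hub carries the business key and a satellite carries the descriptive history. The policy example and all table and column names are hypothetical:

    -- Hub: one row per business key, plus load audit columns.
    CREATE TABLE hub_policy (
        policy_hk     CHAR(32)     NOT NULL PRIMARY KEY,  -- hash of business key
        policy_number VARCHAR2(20) NOT NULL UNIQUE,       -- business key
        load_dts      TIMESTAMP    NOT NULL,
        record_source VARCHAR2(50) NOT NULL
    );

    -- Satellite: full history of descriptive attributes for each hub key.
    CREATE TABLE sat_policy_details (
        policy_hk     CHAR(32)     NOT NULL REFERENCES hub_policy (policy_hk),
        load_dts      TIMESTAMP    NOT NULL,              -- start of validity
        hash_diff     CHAR(32)     NOT NULL,              -- attribute-change detection
        premium_amt   NUMBER(12,2),
        status        VARCHAR2(20),
        record_source VARCHAR2(50) NOT NULL,
        PRIMARY KEY (policy_hk, load_dts)
    );

New attribute values are appended to the satellite with a new load_dts, which is what gives the warehouse its historical, auditable view of the data.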

Environment: Informatica 9.5/8.6, Erwin, Oracle, Flat Files, SQL, Toad, Cognos 8, Control-M
