ETL Developer Resume
McLean, VA
PROFESSIONAL SUMMARY:
- ETL Informatica Developer with 4 years of total IT experience, operating as an ETL Developer on a wide variety of projects; skilled in the analysis, design, and implementation of package-enabled business transformation initiatives.
- Superior SQL skills, with the ability to write and interpret complex SQL queries; skillful in SQL optimization, ETL debugging, and performance tuning.
- Experience developing online transaction processing (OLTP), operational data store (ODS), and decision support system (DSS) databases (e.g., data warehouses).
- Strong data warehousing experience in application development and quality assurance testing using Informatica PowerCenter 10.2/9.6/9.1/8.6 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart, PowerConnect 6.2, PowerExchange, OLAP, and OLTP.
- Skilled at configuring OBIEE Metadata Objects including repository, variables, interactive dashboards, and reports.
- Proficient in integrating various data sources with multiple relational databases such as Oracle 12c/11g/10g/9i, MS SQL Server 2014/2012/2008, DB2 8.0/7.0, Teradata, VSAM files, and flat files into the staging area, ODS, Data Warehouse, and Data Mart.
- Expertise in creating Schema Objects (Attributes, Facts, Hierarchies, Transformations) and Public Objects (Metrics, Filters, Prompts, Custom Groups, Consolidations, Templates) to develop documents and reports. Have used Level Metrics, Conditional Metrics, Advanced Filters (using Apply Simple), and Advanced Prompts.
- Experience migrating data from existing data stores and developing capacity plans for new and existing systems.
- Knowledge of Medicaid and Medicare services: CMS, health assessment systems, HIPAA, PPACA (Patient Protection and Affordable Care Act), compliance issues, Confidential INC and SNOMED mapping, ICD-9, Electronic Health Records, and Electronic Medical Records.
- Extensive Experience in developing and performance tuning of Informatica mappings.
- Highly proficient in the use of SQL for developing complex Stored Procedures, Triggers, Tables, Views, User Defined Functions, Relational Database models and Data integrity, and SQL joins.
- Created indexes and indexed views in observance of business rules, and created effective functions and appropriate triggers to support efficient data manipulation and data consistency.
- Worked in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
- Technical expertise in designing technical processes by using Internal Modeling & working with Analytical Teams to create design specifications; successfully defined & designed critical ETL processes, Extraction Logic, Job Control Audit Tables, Dynamic Generation of Session Parameter File, File Mover Process, etc.
- Worked in T-SQL coding and testing: functions, views, triggers, cursors, stored procedures, data dictionary queries, etc.
- Strong experience in providing end-to-end business intelligence solutions by configuring metadata and building Business Objects reports using InfoView, Web Intelligence, and Universe Designer.
- Ability to meet deadlines and handle multiple tasks, flexible in work schedules and possess good communication skills.
EXPERIENCE:
ETL Developer
Confidential, McLean, VA
Responsibilities:
- Involved in designing Informatica mappings and creating the mappings from scratch as per the requirements.
- Helped IT reduce the cost of maintaining the on-campus Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
- Worked with Informatica Cloud to create Source/Target connections and to monitor and synchronize the data in SFDC.
- Extracted data from flat files and relational databases into the staging area.
- Documented technical specification, business requirements, functional specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables and defined ETL standards.
- Designed and created data extracts supporting reporting applications in SSRS, Power BI, Tableau, and other visualization tools.
- Worked on all types of transformations available in the Power BI Query Editor.
- Explored data in a variety of ways and across multiple visualizations using Power BI. Strategic expertise in design of experiments, data collection, and analysis.
- Analyzed and validated the business model in OBIEE Administration Tool by performing consistency check, validating logical source tables, logical columns, and validating repository level calculations done in the business model layer.
- Created different types of customized reports (drilldown, aggregation) to meet client requirements. Built the business model and established relationships and foreign keys (physical and logical) between tables. Created business reports using BI Answers as per requirements. Generated various ad-hoc reports and worked with the OBIEE 11g dashboard developer.
- Created/Modified Metadata repository RPD by using OBIEE Admin tool: Physical, Business Model and Mapping, and Presentation Layers.
- Wrote and executed complex T-SQL queries using SQL Server Management Studio for back-end data validation testing.
- Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
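The Type II pattern mentioned above can be sketched minimally in Python against SQLite; the customer_dim table, its columns, and the sample dates are illustrative assumptions, not details from any actual project mapping:

```python
import sqlite3

# Hypothetical dimension table for demonstrating SCD Type II.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE customer_dim (
        customer_id INTEGER,
        city        TEXT,
        eff_date    TEXT,
        end_date    TEXT,
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO customer_dim VALUES (101, 'McLean', '2020-01-01', '9999-12-31', 1)")

def apply_scd2(cur, customer_id, new_city, load_date):
    """Expire the current row if the attribute changed, then insert the new version."""
    cur.execute(
        "UPDATE customer_dim SET end_date = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (load_date, customer_id, new_city),
    )
    if cur.rowcount:  # only insert a new version when something actually changed
        cur.execute(
            "INSERT INTO customer_dim VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, load_date),
        )

apply_scd2(cur, 101, "Sacramento", "2021-06-01")
rows = cur.execute(
    "SELECT city, is_current FROM customer_dim ORDER BY eff_date"
).fetchall()
print(rows)  # [('McLean', 0), ('Sacramento', 1)]
```

In an Informatica mapping the same decision is typically made with a Lookup plus an Update Strategy transformation; the SQL form above just shows the expire-and-insert idea.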
- Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
- Wrote complex SQL queries as per business requirements to fetch data from relational tables.
- Documented test cases and test plan data for the mappings developed.
- Performance tuning of the process at the mapping level, session level, source level, and the target level.
- Implemented various improvements such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, and validation techniques to improve load efficiency.
- Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
- Extracted Salesforce CRM information into the BI Data Warehouse using the Force.com API and Informatica On Demand, integrating it with Oracle financial information for advanced reporting and analysis.
- Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
- Worked with Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and TPT, and tools such as SQL Assistant and Viewpoint.
- Assisted in the oversight for compliance to the Enterprise Data Standards, data governance and data quality.
- Created organized, customized analysis and visualized projects and dashboards to present to Senior Level Executives.
ETL Developer
Confidential, Sacramento, CA
Responsibilities:
- Worked on multiple projects using Informatica as the ETL tool to extract data from IBM DB2, Oracle 11g, and flat file systems; performed massive data cleansing, applied all the business rules, and loaded the data into the target system.
- Worked with Business Users and Business Analyst for requirement gathering and business analysis.
- Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of the Fact and Dimension tables and Source to Target Mapping.
- Administered & scheduled batches/sessions using server manager. Modified and Migrated the Informatica mappings (XML) from the Development to Test Server and Production server.
- Developed complex SQL queries using stored procedures, common table expressions (CTEs), and temporary tables to support Power BI and SSRS reports.
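A minimal sketch of the kind of CTE-based aggregation query used to feed such reports, run here against an in-memory SQLite database; the sales table, its values, and the threshold are hypothetical:

```python
import sqlite3

# Hypothetical sales data for demonstrating a CTE-based report query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('East', 100), ('East', 50), ('West', 75);
""")

query = """
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total
    FROM region_totals
    WHERE total > 70
    ORDER BY total DESC
"""
result = cur.execute(query).fetchall()
print(result)  # [('East', 150.0), ('West', 75.0)]
```

The CTE keeps the aggregation step named and readable, which is the usual reason to prefer it over a nested subquery in report-backing SQL.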
- Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards.
- Involved in designing a real-time data warehouse schema; created logical and physical data models with dependency constraints and indexes on the OLAP tables.
- Worked closely with Users, Developers and Administrators to resolve ongoing Production Problems by reviewing design changes made to production systems and made corresponding changes to OBIEE Repository.
- Extensively used mapplets and Reusable Transformations for reusability of mapping Logic.
- Implemented complex business logic by writing PL/SQL stored procedures.
- Worked with Session Logs and Workflow Logs for Error handling and troubleshooting in DEV environment.
- Created SSIS Packages to extract data from Excel Files, MS Access files using Pivot Transformation, Fuzzy Lookup, Derived Columns, Condition Split, Term extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task to generate underlying data for the reports and to export cleaned data from Excel Spreadsheets, Text file, MS Access and CSV files to data warehouse.
- Extracted data from different flat files, MS Excel, MS Access and transformed the data based on user requirement using Informatica Power Center and loaded data into target, by scheduling the sessions.
- Executed test scripts to verify actual results against expected results, using Power Connect for source (DB2) validation and Oracle for target validation.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
- Used Informatica Power Center for extractions, transformation, and loading data from heterogeneous sources into the target databases.
Data Analyst
Confidential
Responsibilities:
- Developed several mappings to load data from different source systems into Enterprise warehouse.
- Worked with end users to gather the Functional Specifications.
- Identified risks in the schedule and marked them in the use cases.
- Extensively used star schema methodologies in building and designing the logical data model into dimensional models.
- Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning, and indexing schemes case by case for the facts and dimensions.
- Involved in analyzing existing logical and physical data modeling with Star and Snowflake schema techniques using Erwin in Data warehouse.
- Designed the Data Warehousing ETL procedures for extracting the data from different source systems to the target system.
- Worked with project team representatives to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
- Involved in Analysis, Requirements Gathering and documenting Technical specifications.
- Designed and optimized the Mapping to load the data in slowly changing dimensions.
- Optimized high volume tables in Teradata using various join index techniques, secondary indexes, join strategies and hash distribution methods.
- Created Data Quality Scripts using SQL and Hive to validate successful data load and quality of the data. Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
- Involved in the design and loading of hash-current tables to capture change records (determine insert vs. update).
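The insert-else-update decision against a hash-current table can be sketched as follows; the hash_current table, the MD5 choice, and the record values are assumptions for illustration only:

```python
import hashlib
import sqlite3

def row_hash(values):
    """Concatenate the tracked attributes and hash them for change detection."""
    return hashlib.md5("|".join(map(str, values)).encode()).hexdigest()

# Hypothetical hash-current table keyed by the natural id.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE hash_current (id INTEGER PRIMARY KEY, row_md5 TEXT)")

def upsert(cur, rec_id, attrs):
    """Insert a new record, update a changed one, or do nothing."""
    h = row_hash(attrs)
    existing = cur.execute(
        "SELECT row_md5 FROM hash_current WHERE id = ?", (rec_id,)
    ).fetchone()
    if existing is None:
        cur.execute("INSERT INTO hash_current VALUES (?, ?)", (rec_id, h))
        return "insert"
    if existing[0] != h:
        cur.execute("UPDATE hash_current SET row_md5 = ? WHERE id = ?", (h, rec_id))
        return "update"
    return "no-op"

actions = [
    upsert(cur, 1, ("Alice", "McLean")),   # new id
    upsert(cur, 1, ("Alice", "Austin")),   # attribute changed
    upsert(cur, 1, ("Alice", "Austin")),   # unchanged
]
print(actions)  # ['insert', 'update', 'no-op']
```

Comparing one stored hash per row avoids comparing every column on every load, which is the point of keeping a hash-current table.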
- Created, scheduled, and monitored workflow sessions (run on demand or on a schedule) using the Informatica PowerCenter Workflow Manager.
- Created reusable mappings to extract data from sources such as Oracle, Netezza, and flat files and load it into the target Netezza database.
- Performed data quality analysis and executed the Data Quality Management (DQM) package.
- Developed mappings to create and update parameter files with updated parameter values before starting the load.
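Generating a session parameter file before a load can be sketched as below; the folder, workflow, session, and parameter names are hypothetical, and a real mapping would derive the values from control tables rather than hard-coding them:

```python
from datetime import date

# Hypothetical mapping parameters refreshed before each load.
params = {
    "$$LOAD_DATE": date(2021, 6, 1).isoformat(),
    "$$SRC_SYSTEM": "SFDC",
}

# Informatica parameter files scope values with a [folder.WF:workflow.ST:session] header.
lines = ["[HYPO_FOLDER.WF:wf_daily_load.ST:s_m_load_stage]"]
lines += [f"{name}={value}" for name, value in params.items()]
content = "\n".join(lines)

with open("daily_load.par", "w") as fh:
    fh.write(content + "\n")

print(content)
```

The workflow would then be configured to read daily_load.par, so each run picks up the freshly written values.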
- Generated Tableau dashboards for sales with forecasts and trend lines, using combination charts for clear understanding.