
Sr. ETL Informatica/Database Developer Resume


Oakland, CA

PROFESSIONAL SUMMARY:

  • 8 years of experience in designing and developing data warehouse, data migration, identity resolution, and data quality projects using Informatica products (Informatica Power Center 7.x/8.x/9.x, Informatica Power Exchange, and Informatica Data Quality).
  • Extensive programming experience in Oracle, Teradata, MS SQL Server, MySQL, Netezza, and DB2.
  • Extensive reporting experience in OBIEE.
  • Good knowledge of the Hadoop ecosystem and Big Data, including HDFS, Hive (and its analytical functions), Pig, Sqoop, MapReduce, HBase, and NoSQL stores.
  • Extracted and loaded data between diverse source and target systems such as Oracle, SQL Server, MySQL, DB2, Teradata, Netezza, Salesforce, SAP, @task, and flat files.
  • Experienced in writing SQL statements and PL/SQL code for database objects such as tables, views, indexes, sequences, procedures, functions, packages, and triggers.
  • Good back-end programming experience using PL/SQL, SQL, and MS SQL: stored procedures, functions, ref cursors, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views, database links, export/import utilities, and ad hoc SQL queries.
  • Expertise in QA testing in distributed UNIX/Windows environments with Oracle, SQL Server, Teradata, and DB2 back ends; performed end-to-end testing.
  • Extensively wrote test scripts for back-end validations.
  • Basic knowledge of the Hadoop ecosystem (Hive, Sqoop, and Pig) and IBM Big SQL.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins, and hash indexes in Teradata.
  • Extensively worked with Teradata utilities (BTEQ, FastExport, FastLoad, MultiLoad) to export data from and load data into different source systems, including flat files.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL trace in both Teradata and Oracle (a brief sketch follows this list).
  • Excellent experience with the different Teradata index types (PI, SI, JI, AJI, and PPI, including MLPPI and SLPPI), join strategies, and partitioning.
  • Extensive knowledge of relational database management systems (RDBMS) and normalized database design.
  • Experience in data modeling with SQL Data Modeler and Visio.
  • Well versed in exposing stored procedures as web services in Informatica for data warehousing projects.
  • Database-side table design and implementation for data warehouses and related ETL processes.
  • Analysis, design, development, and implementation of data warehouse and ETL client/server applications.
  • Extensive working experience in data migration using Informatica Power Center.
  • Knowledge of data warehouse implementation using tools such as Informatica Power Mart, Power Center, Power Connect, and Power Exchange CDC.
  • Proficient in using Informatica Workflow Manager, Workflow Monitor, Server Manager, and pmcmd (the Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions.
  • Strong in developing conceptual, logical, physical, and dimensional data models, using star schemas for data warehousing projects.
  • Experienced in Tuning Informatica Mappings to identify and remove processing bottlenecks.
  • Automated and scheduled UNIX shell scripts and Informatica sessions and batches using Autosys.
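As a brief illustration of the Teradata tuning workflow above (a hedged sketch only; the database, table, and column names are hypothetical placeholders):

    -- Refresh optimizer statistics so EXPLAIN estimates are reliable.
    COLLECT STATISTICS ON sales_db.daily_sales COLUMN (sale_dt);
    COLLECT STATISTICS ON sales_db.daily_sales COLUMN (customer_id);

    -- Inspect the optimizer's plan (join strategy, redistribution,
    -- estimated row counts) before promoting the query.
    EXPLAIN
    SELECT customer_id, SUM(sale_amt)
    FROM sales_db.daily_sales
    WHERE sale_dt BETWEEN DATE '2015-01-01' AND DATE '2015-01-31'
    GROUP BY customer_id;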

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 9.6/8.6/7.2 (Designer, Mappings, Mapplets, Transformations, Workflow Manager, Workflow Monitor), Power Exchange, and SAS.

OLAP Tools: Cognos, OBIEE.

Dimensional Data Modeling: Erwin, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables.

Databases: Oracle, SQL, PL/SQL, Teradata, SQL*Plus, MS SQL Server, MS Access, Netezza, MySQL, DB2, Big SQL, and Hive.

Programming: SQL, PL/SQL, SQL*Plus, C, C++, C#.

Environment: Windows, Linux (Red hat), UNIX.

Testing: Manual, Modular, System, Integration, Unit, Regression, and Performance Testing

Defect Tracking Tools: Quality Center, Clear Quest, Bugzilla, Visual Studio Team System

Web Services: SAP, Taleo, SFDC and @task.

Tools: Toad, SQL Developer, Teradata SQL Assistant, SVN (version control), and WinSCP.

PROFESSIONAL EXPERIENCE:

Confidential, Oakland, CA

Sr. ETL Informatica/ Database Developer

Responsibilities:

  • Extracted data from Teradata, DB2, and Netezza and loaded it into the DOR VDW using Informatica and SAS programs.
  • Prepared technical design/specifications for data extraction, transformation, and loading.
  • Analyzed sources, transformed and mapped data, and loaded it into targets using Informatica Power Center Designer.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Performed performance evaluation of the ETL for full-load and delta-load cycles.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce run time.
  • Analyzed workflow, session, event, and error logs to troubleshoot the Informatica ETL process.
  • Extensively involved in writing DDL and DML statements.
  • Extensively worked on writing SQL queries using joins, ORDER BY, and GROUP BY.
  • Created collections for accessing and storing complex data resulting from joins across a large number of tables.
  • Developed PL/SQL packages, procedures, and functions in accordance with business requirements for loading data into database tables (see the PL/SQL loader sketch after this list).
  • Tested Complex ETL jobs based on business user requirements and business rules to load data from source RDBMS tables to target tables.
  • Scheduled Oracle jobs as required (daily, weekly, and monthly).
  • Monitored day-to-day processing for different data loads and resolved issues.
  • Worked extensively on exception handling to troubleshoot PL/SQL code.
  • Extensively used TOAD, SQL Developer, and Teradata SQL Assistant to increase productivity and application code quality.
  • Reviewed business documents, created SQL statements, and modified scripts accordingly.
  • Developed SQL*Loader scripts to load the staging area with data from flat files.
  • Tested all scripts in the test database and moved them into production.
  • Involved in table partitioning in the database.
  • Created UNIX scripts to handle FTP of source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Wrote Teradata SQL queries for joins and other table modifications.
  • Created customized MultiLoad (MLoad) scripts on the UNIX platform for Teradata loads.
  • Analyzed data issues and recommended solutions.
  • Wrote Teradata BTEQ scripts to implement business logic (see the BTEQ sketch after this list).
  • Used SAS PROC SQL, PROC IMPORT, DATA steps, and PROC DOWNLOAD to extract data from fixed-format flat files and Teradata and load it into Oracle tables.
  • Created SAS data sets for comparison, verifying that file formats matched and record counts were correct.
  • Created variable frequency reports in SAS to compare source and target data.
  • Created SAS QA programs to test the data.
  • Wrote test cases for ETL to compare source and target database systems.
  • Tested records with logical deletes using flags.
  • Maintained release-related documents using configuration management techniques.
  • Monitored indexes and analyzed their status for performance tuning and query optimization.
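A minimal sketch of the PL/SQL loading and exception-handling pattern described above; the procedure, table, and column names are hypothetical placeholders, not the actual project objects:

    -- Illustrative loader only; all object names are placeholders.
    CREATE OR REPLACE PROCEDURE load_stg_to_target AS
      v_rows PLS_INTEGER := 0;
    BEGIN
      INSERT INTO target_customers (cust_id, cust_name, load_dt)
      SELECT cust_id, cust_name, SYSDATE
      FROM   stg_customers;
      v_rows := SQL%ROWCOUNT;
      COMMIT;
      DBMS_OUTPUT.PUT_LINE('Loaded ' || v_rows || ' rows.');
    EXCEPTION
      WHEN DUP_VAL_ON_INDEX THEN
        ROLLBACK;
        -- Log and re-raise so the scheduler marks the job as failed.
        DBMS_OUTPUT.PUT_LINE('Duplicate key during load: ' || SQLERRM);
        RAISE;
      WHEN OTHERS THEN
        ROLLBACK;
        DBMS_OUTPUT.PUT_LINE('Load failed: ' || SQLERRM);
        RAISE;
    END load_stg_to_target;
    /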
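And a minimal BTEQ sketch of the kind of business-logic script mentioned above, assuming a hypothetical dormant-account rule; the logon string and table names are placeholders:

    .LOGON tdprod/etl_user,password;       -- placeholder credentials

    /* Sample business rule: flag accounts with no activity in a year. */
    UPDATE edw.accounts
    SET    status_cd = 'DORMANT'
    WHERE  last_txn_dt < CURRENT_DATE - INTERVAL '365' DAY;

    -- Abort with a nonzero return code so the scheduler can alert.
    .IF ERRORCODE <> 0 THEN .QUIT 1;

    .LOGOFF;
    .QUIT 0;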

Environment: Informatica, Oracle SQL and PL/SQL, DB2, SQL SERVER, Teradata, Netezza, SAS and UNIX.

Confidential, Bloomington, IL

ETL Informatica Developer

Responsibilities:

  • Gathered and analyzed requirements from users.
  • Designed functional specifications covering source, target, current and proposed processes, and the interface process flow diagram.
  • Designed the ETL process, load strategy, and requirements specification after gathering requirements from end users.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Maintained release-related documents using configuration management techniques.
  • Involved in data design and modeling, system study, design, and development.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created folders and placed all tested ETL, DB2, and UNIX scripts in the staging path for DST and production movement.
  • Created database objects such as views, materialized views, procedures, and packages using Oracle PL/SQL with SQL Developer.
  • Created records, PL/SQL tables, and collections (nested tables and varrays) to improve performance by reducing context switching (see the bulk-processing sketch after this list).
  • Created a number of database triggers implementing business rules in PL/SQL.
  • Involved in loading and reloading county data into the database.
  • Participated in performance tuning of SQL queries using EXPLAIN PLAN to improve application performance.
  • Worked extensively on exception handling, using both system-defined and user-defined exceptions.
  • Monitored scheduled jobs.
  • Created UNIX shell scripts to automate the execution process.
  • Validated the ETL load process to ensure target tables were populated according to the data mapping provided, satisfying the transformation rules.
  • Ensured that the mappings were correct and conducted data validation testing.
  • Tuned ETL jobs, procedures, scripts, SQL queries, and PL/SQL procedures to improve system performance.
  • Worked on bug fixes on existing Informatica Mappings to produce correct output.
  • Identified the bottlenecks in the source, target, mapping, and loading process and successfully attended/resolved the performance issues across this project.
  • Helped the team fix technical issues and tuned database queries for better performance.
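A hedged sketch of the context-switch reduction mentioned above (bulk fetch with BULK COLLECT, bulk insert with FORALL); the staging and target table names are hypothetical:

    -- Illustrative only; assumes staging_customers and county_customers
    -- share the same column structure.
    DECLARE
      TYPE t_cust_tab IS TABLE OF staging_customers%ROWTYPE;
      v_batch t_cust_tab;
      CURSOR c_src IS SELECT * FROM staging_customers;
    BEGIN
      OPEN c_src;
      LOOP
        -- One fetch returns up to 1,000 rows instead of one row per call.
        FETCH c_src BULK COLLECT INTO v_batch LIMIT 1000;
        EXIT WHEN v_batch.COUNT = 0;

        -- One FORALL call sends the whole batch to the SQL engine.
        FORALL i IN 1 .. v_batch.COUNT
          INSERT INTO county_customers VALUES v_batch(i);
      END LOOP;
      CLOSE c_src;
      COMMIT;
    END;
    /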

Environment: Informatica 9.1, Mainframe DB2, COBOL, UNIX, Oracle SQL and PL/SQL.

Confidential, Ashburn, VA

ETL Informatica Developer

Responsibilities:

  • Developed Informatica parameter files to filter the daily data from the source system.
  • Implemented the Type 2 Slowly Changing Dimension methodology to keep track of historical data (see the SCD sketch after this list).
  • Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables a single view of each customer and helps control mailing-list costs by preventing multiple pieces of mail.
  • Prepared the data flow of all Informatica objects created, to support testing at various levels: unit, performance, integration, and baseline.
  • Studied Session Log files to find errors in mappings and sessions.
  • Created UNIX scripts to handle FTP of source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Worked closely with the DBA on creating tables, indexes, and views and on index rebuilds.
  • Involved in table partitioning in the database.
  • Developed PL/SQL procedures and functions in accordance with business requirements for loading data into database tables.
  • Developed the SQL code for performance and penalty statistics calculation.
  • Involved in tuning SQL queries.
  • Executed daily and monthly reports on UNIX and automated them using shell scripts.
  • Created SQL* Loader scripts to load data into temporary staging tables.
  • Created scripts for views, materialized views, and partitioned tables (see the DDL sketch after this list).
  • Created the test environment for the staging area and loaded it with data from multiple sources.
  • Extracted test data from tables and loaded it into SQL tables.
  • Validated the data in the reports by writing simple to complex SQL queries in the transactional system.
  • Responsible for Analyzing and Implementing the Change Requests.
  • Created complex stored procedures, views, SQL joins and scripts.
  • Wrote high-quality, well-documented code according to standards.
  • Involved in fine-tuning stored procedures using PL/SQL collections and their bulk fetch (BULK COLLECT) and bulk insert (FORALL) features.
  • Involved in daily status calls with clients.
  • Designed the ETL process, load strategy, and requirements specification after gathering requirements from end users.
  • Maintained release-related documents using configuration management techniques.
  • Designed functional specifications covering source, target, current and proposed processes, and the interface process flow diagram.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Extensively worked on data extraction, transformation, and loading from CSV files, XML files, and S&P (Standard & Poor’s) data into a Microsoft SQL Server database.
  • Created folders and placed all tested ETL, SQL Server, and UNIX scripts in the staging path for production movement.
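A minimal sketch of the Type 2 SCD close-and-insert pattern noted above; the dimension, staging, and sequence names are hypothetical placeholders:

    -- Step 1: expire the current version of any customer whose tracked
    -- attributes changed in the staging feed.
    UPDATE dim_customer d
    SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.cust_id = d.cust_id
                   AND    (s.cust_name <> d.cust_name OR s.addr <> d.addr));

    -- Step 2: insert a new current version for changed and new customers.
    INSERT INTO dim_customer (cust_key, cust_id, cust_name, addr,
                              eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.addr,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.cust_id = s.cust_id
                       AND    d.current_flag = 'Y');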
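And an illustrative sample of the partitioned-table and materialized-view DDL mentioned above; object names and columns are placeholders:

    -- Range-partitioned table, one partition per month.
    CREATE TABLE fact_penalties (
      acct_id     NUMBER,
      penalty_amt NUMBER(12,2),
      txn_dt      DATE
    )
    PARTITION BY RANGE (txn_dt) (
      PARTITION p2014m12 VALUES LESS THAN (DATE '2015-01-01'),
      PARTITION p2015m01 VALUES LESS THAN (DATE '2015-02-01')
    );

    -- Materialized view to pre-aggregate data for monthly reports.
    CREATE MATERIALIZED VIEW mv_monthly_penalties
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND AS
    SELECT TRUNC(txn_dt, 'MM') AS txn_month,
           SUM(penalty_amt)    AS total_penalty
    FROM   fact_penalties
    GROUP  BY TRUNC(txn_dt, 'MM');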

Environment: Informatica, Oracle SQL and PL/SQL, Quality Center, DB2, Unix Shell Scripting.

Confidential, Orem, UT

ETL Informatica Developer

Responsibilities:

  • Actively involved in the full project life cycle, including screen design, table design, preparation of the user manual and test plans, and program development using Oracle 10g.
  • Interacted with the development team in fine-tuning SQL and PL/SQL code.
  • Implemented Oracle and application software patches.
  • Developed forms and custom reports using Oracle Forms and Reports.
  • Wrote database creation scripts and server-side PL/SQL stored procedures, functions, and triggers; installed and configured SQL*Net.
  • Built the front end as Oracle web forms that allow users to log in, enter input processing information (such as file name, date, month, and year), and submit the form for the load process.
  • Worked with Informatica Designer based on the functional description, scope, and detailed functional requirements.
  • Designed and Created data cleansing and validation scripts using Informatica ETL tool.
  • Responsible for creating, importing all the required sources and targets to the shared folder.
  • Worked with different sources such as relational databases (Oracle, SQL Server, and MySQL), flat files, Adobe PDFs, XML files, Salesforce, SAP/BW, SAP/ECC, Taleo, and @task.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
  • Created new database objects such as tables, views, indexes, triggers, and synonyms (see the trigger sketch after this list).
  • Wrote new PL/SQL scripts using packages, procedures, functions, and triggers.
  • Created collections for accessing and storing complex data resulting from joins across a large number of tables.
  • Involved in loading and reloading county data into the database.
  • Extensively used TOAD to increase productivity and application code quality.
  • Reviewed business documents, created SQL statements, and modified scripts accordingly.
  • Scheduled the jobs in Dev and Testing environment using Informatica Scheduler.
  • Prepared the data flow of all Informatica objects created, to support integration testing.
  • Prepared Unit Test Plans.
  • Supported user queries regarding the availability of data in the Data Warehouse.
  • Performed troubleshooting for data non-uniformity.
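A minimal sketch of the kind of business-rule trigger described above; the table, columns, and rules are hypothetical placeholders:

    CREATE OR REPLACE TRIGGER trg_county_audit
    BEFORE INSERT OR UPDATE ON county_data
    FOR EACH ROW
    BEGIN
      -- Stamp who changed the row and when (a sample audit rule).
      :NEW.last_upd_by := USER;
      :NEW.last_upd_dt := SYSDATE;

      -- Reject obviously invalid load dates (a sample validation rule).
      IF :NEW.load_dt > SYSDATE THEN
        RAISE_APPLICATION_ERROR(-20001, 'load_dt cannot be in the future');
      END IF;
    END;
    /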

Environment: Informatica Power Center 8.6.1, Informatica B2B Data Exchange and Data Transformation, Oracle 10g/11g, SQL Server, MySQL, Toad 9.1, Tidal 5.3.1, flat files, Windows NT, UNIX, Salesforce, SAP/BW, SAP/ECC, Quality Center, Taleo, and @Task.
