
Sr. ETL Developer Resume


Fremont, CA

SUMMARY:

  • 7+ years of experience as an ETL Developer building ETL processes for data warehouses and data migration using the Informatica PowerCenter, OBIEE and SSIS (SQL Server Integration Services) ETL tools.
  • Strong hands-on experience with Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain and Queryman), Teradata parallel support, and Perl and UNIX shell scripting; a minimal BTEQ wrapper sketch follows this list.
  • Experience working with MS SQL Server 2012+, Oracle 11g, Oracle APEX, Big Data, Oracle NoSQL and Oracle Exadata.
  • Extensive knowledge of PowerCenter components such as PowerCenter Designer, PowerCenter Repository Manager, Workflow Manager and Workflow Monitor.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Thorough knowledge of building ETL processes to load data from different data sources, with a good understanding of Informatica installation.
  • Experience integrating business applications with the Informatica MDM hub using batch processes, SIF and message queues.
  • Ability to ingest data from many different sources, including SOAP/REST web services.
  • Demonstrated experience designing and implementing Informatica Data Quality (IDQ v9.1) applications for business and technology users across the full development life cycle.
  • Strong in SQL, T-SQL, PL/SQL, SQL*Loader, SQL*Plus, MS SQL and Pro*C.
  • Experience implementing OBIEE 11.1.1.3.0/10.1.3.3.x and Oracle BI Applications 7.9.x, including hands-on expertise in RPD development, Siebel/Oracle BI Answers, BI Dashboards, BI Delivers and Reports.
  • Involved in the complete software development life cycle (SDLC) of projects, with experience in domains such as Retail, Healthcare Insurance and Automotive.
  • Experience with data cleansing, data profiling and data analysis, plus UNIX shell scripting, Perl scripting, and SQL and PL/SQL coding.
  • Involved in testing, test plan preparation and process improvement for ETL development, with good exposure to development, testing, debugging, implementation, documentation, user training and production support.
  • Experience with Oracle-supplied packages, dynamic SQL, records and PL/SQL tables.
  • Hands-on experience with various NoSQL databases such as HBase and MongoDB.
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Experience in Informatica mapping specification documentation and tuning mappings to increase performance; proficient in creating and scheduling workflows, with expertise in automating ETL processes using scheduling tools such as Autosys, IBM Tivoli and Control-M.
  • Applied the concept of Change Data Capture (CDC) and imported sources from legacy systems using Informatica PowerExchange (PWX).
  • Good experience implementing Informatica B2B DX/DT and Informatica BDE.
  • Good knowledge of EAI, ESB and B2B integration.
  • Hands-on experience developing stored procedures, functions, views, triggers and SQL queries using SQL Server and Oracle PL/SQL.
  • Worked to a great extent on the design and development of Tableau and Power BI visualizations, including preparing dashboards using calculations, parameters, calculated fields, groups, sets and hierarchies.
  • Experience using SSIS tools such as the Import and Export Wizard, Package Installation and the SSIS Package Designer.
  • Good understanding of Oracle Hyperion performance management.
  • Experience importing/exporting data between different sources such as Oracle, Access and Excel using the SSIS/DTS utilities.
  • Expertise in using global variables, expressions and functions in reports, with extensive experience handling sub-reports in SSRS.
  • Strong data modeling experience covering ER diagrams, dimensional data modeling, star schema modeling and snowflake modeling using tools such as Erwin and ER/Studio.
  • Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
  • Exposure to Informatica Cloud Services.
  • Good exposure to big data technologies such as Hadoop, Hive, HBase, MapReduce and HDFS.
  • Experience with industry software development methodologies such as Waterfall and Agile within the software development life cycle.
  • Experience in source system analysis and data extraction from various sources such as flat files, Oracle 11g/10g/9i/8i and DB2 UDB.
  • Involved in understanding client requirements, analyzing functional specifications, preparing technical specifications and reviewing technical specifications.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Good exposure to interacting with RESTful web services, SaaS, PaaS and IaaS.
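
As a concrete illustration of the Teradata utility experience above, the following is a minimal shell sketch of a BTEQ export wrapper. It is illustrative only: the TDP alias (tdprod), the credential variables and the sales_daily table are invented placeholders, not details from any engagement described here.

    #!/bin/sh
    # Minimal BTEQ export wrapper (illustrative; tdprod, $TD_USER/$TD_PASS
    # and sales_daily are invented placeholders).
    bteq <<EOF
    .LOGON tdprod/$TD_USER,$TD_PASS
    .EXPORT REPORT FILE = /tmp/sales_daily.out
    SELECT sale_dt, SUM(amount) AS total_amount
    FROM   sales_daily
    GROUP  BY sale_dt
    ORDER  BY sale_dt;
    .EXPORT RESET
    .LOGOFF
    .QUIT
    EOF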

TECHNICAL SKILLS:

Data warehousing: Informatica PowerCenter 10/9.6/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Power Connect, PowerExchange, Informatica PowerMart 6.2/5.1.2/5.1.1/5.0/4.7.2, Informatica MDM 10.1/9.x, OBIEE 11g/10g, Informatica CDC, Talend RXT 4.1, Informatica BDE, Informatica B2B DX/DT, SQL*Loader, Informatica On Demand (IOD), flat files (fixed-width, CSV, tilde-delimited, XML), IDQ, IDE, Oracle Data Integrator (ODI), Data Transformation Services (DTS), Metadata Manager, MS SQL Server Integration Services (SSIS).

Process/Modeling Tools: Software Development Life Cycle (SDLC), MS Visio, Rational Rose, Rational RequisitePro, Rational ClearQuest, VSS.

Scripting: UNIX shell scripting, Perl scripting

Scheduling Tools: Autosys, Control-M, IBM Tivoli

Methodology: UML, RUP, Agile, Business Modeling

Databases: MS Access, SQL Server, Oracle, Teradata, DB2, Hive

Languages: SQL, Java, XML, HTML, Mainframe, .NET

Operating Systems: MS-DOS, Windows 98/2000/NT/XP

Office Tools: MS Office 2003/2000 (Word, Excel, PowerPoint, Access)

Testing tools: QTP, Quality Center

WORK EXPERIENCE:

Confidential, Fremont, CA

Sr. ETL Developer

Responsibilities:

  • Used Informatica PowerCenter for extraction, transformation and load (ETL) of data in the data warehouse.
  • Set up the ETL logic to populate various BI reports and tables.
  • Set up Oozie workflows to keep the ETL jobs running and up to date.
  • Using the Informatica PowerCenter tools, developed workflows with the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.
  • Created packages using SQL Server Integration Services (SSIS).
  • Designed SSIS packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response times for users.
  • Extracted data from Teradata database.
  • Sole developer building a process in IDQ for cleansing and migrating global customer data from Oracle 11i to 12i.
  • Executed and scheduled workflows using the Informatica Cloud tool to load data from source to target.
  • Migrated data from legacy databases in Oracle Exadata to Hive.
  • Used Informatica PowerCenter 9.6.1 to extract, transform and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
  • Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Performed data modeling and high-level design (HLD), plus analysis and preparation of the low-level design (LLD).
  • Configured, designed and delivered MDM hubs across multiple data domains (Party, Service/Product, Prospect).
  • Performed extract, transform and load between traditional RDBMS and Hive/HDFS sources and targets using Informatica BDE.
  • Imported data from the Oracle database into Hive using Sqoop (see the sketch following this list).
  • Developed, validated and maintained HiveQL queries.
  • Ran reports using Pig and Hive queries, and analyzed data with Hive and Pig.
  • Handled webMethods installation, clustering, patch application and maintenance of application servers.
  • Used Informatica PowerExchange for loading/retrieving data from the mainframe system.
  • Analyzed data sources and targets using Informatica Data Profiling option.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Worked with the UNIX team to write UNIX shell scripts that customize server scheduling jobs.
  • Worked with Control-M for scheduling jobs.
  • Applied slowly changing dimensions (Type 1 and Type 2) effectively to handle the delta loads.
  • Prepared various mappings to load the data into different stages such as landing, staging and target tables.
  • Worked with Informatica PowerExchange as well as Informatica Cloud to load data into salesforce.com.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
  • Extracted data from Oracle Exadata and SQL Server, then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote PL/SQL scripts for pre & post session processes and to automate daily loads.
  • Wrote complex stored procedures using dynamic SQL to populate temp tables from fact and dimension tables for reporting purposes (a simplified sketch appears after this role's environment line).
  • Optimized performance through tuning at the source, target, mapping and session levels.
  • Created various tasks like Session, Command, Timer and Event wait.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Developed workflows for data domains to perform extraction, validation, cleansing, matching, loading, exception handling and other data-related activities supporting the customer's MDM requirements.
  • Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM and IDQ tools, loading it to the STAGE and HUB tables, then to the EDW, and finally to the dimension and rollup/aggregate tables.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Wrote UNIX and Perl scripts for business needs.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
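
The Sqoop import mentioned above might look roughly like the sketch below; the JDBC connection string, credential variables and table names are invented placeholders rather than project details.

    #!/bin/sh
    # Illustrative Sqoop import from Oracle into a Hive staging table.
    # The host, SID, credentials and table names are invented placeholders.
    sqoop import \
      --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
      --username "$ORA_USER" --password "$ORA_PASS" \
      --table CUSTOMER_MASTER \
      --hive-import \
      --hive-table stg_customer_master \
      --num-mappers 4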

Environment: Informatica PowerCenter 10/9.6, Informatica Multidomain MDM 9.5.0, Teradata 12, SSIS 2008, IDQ, Oracle 11g/10g, PL/SQL, Autosys, TOAD 9.x, Hive, MS SQL Server 2012, OBIEE, Oracle SQL*Loader, UNIX, shell scripting, Windows XP.
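
A rough shell-plus-PL/SQL sketch of the dynamic-SQL temp-table load described in this role follows; $ORA_CONN and all object names (rpt_temp_sales, fact_sales, dim_geo) are invented for illustration.

    #!/bin/sh
    # Illustrative dynamic-SQL load of a reporting temp table via sqlplus.
    # $ORA_CONN and every table/column name are invented placeholders.
    sqlplus -s "$ORA_CONN" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE
    DECLARE
      v_sql VARCHAR2(4000);
    BEGIN
      -- Build the INSERT at run time so the target temp table can vary by run
      v_sql := 'INSERT INTO rpt_temp_sales (region, total_amount) '
            || 'SELECT d.region, SUM(f.amount) '
            || 'FROM fact_sales f JOIN dim_geo d ON f.geo_key = d.geo_key '
            || 'GROUP BY d.region';
      EXECUTE IMMEDIATE v_sql;
      COMMIT;
    END;
    /
    EXIT
    EOF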

Confidential, San Antonio, TX

Informatica Developer

Responsibilities:

  • Developed mappings/sessions using Informatica Power Center 8.6.1 for data loading.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit, covering analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Performed data manipulations using various Informatica transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Created mappings and sessions to implement technical enhancements for data warehouse by extracting data from sources like Oracle Exadata and Delimited Flat files.
  • Involved in massive data profiling using IDQ prior to data staging.
  • Gained very good knowledge of all the data quality transformations used throughout development.
  • Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD), and a Requirement Traceability Matrix for each project workflow based on information gathered from the Solution Business Proposal document.
  • Used IDQ's standardized plans for address and name cleanup.
  • Worked on IDQ file configuration on users' machines and resolved the related issues.
  • Used IDQ to complete initial data profiling and remove duplicate data.
  • Extensively worked on IDQ admin tasks, serving as both IDQ administrator and IDQ developer.
  • Designed reference data and data quality rules using IDQ, and was involved in cleaning the data in the Informatica Data Quality 9.1 environment.
  • Performed data modeling and design of the data warehouse and data marts using a star schema methodology with conformed, granular dimensions and fact tables.
  • Created and managed virtual machines in Windows Azure, set up communication with the help of endpoints, and handled VM migrations from transactional hosts.
  • Worked on inbound, outbound and carve-out data feeds, and developed mappings, sessions, workflows, command line tasks, etc. for them.
  • Coordinated and worked closely with legal, clients, third-party vendors, architects, DBAs, operations and business units to build and deploy.
  • Used DataStage to manage the metadata repository and to import/export jobs.
  • Used the IDQ Address Validation transformation to generate CASS and AMAS reports for US and Australian addresses.
  • Worked with various Informatica transformations such as Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Stored Procedure, Router and Normalizer.
  • Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Used the pmcmd command to start, stop and ping the Informatica server from UNIX, and created shell scripts to automate the process (see the sketch following this list).
  • Worked on production tickets to resolve the issues in a timely manner.
  • Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.
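
A minimal sketch of such a pmcmd wrapper appears below; the domain, service, folder, workflow and credential names are all invented placeholders.

    #!/bin/sh
    # Illustrative pmcmd wrapper: ping the Integration Service, then start a
    # workflow and wait for completion. All names here are invented.
    DOMAIN=Dom_ETL
    SERVICE=IS_ETL
    FOLDER=FIN_DW
    WF=wf_load_accounts

    # Fail fast if the Integration Service is unreachable
    pmcmd pingservice -sv "$SERVICE" -d "$DOMAIN" || exit 1

    # Start the workflow and block until it finishes
    pmcmd startworkflow -sv "$SERVICE" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f "$FOLDER" -wait "$WF"
    status=$?
    if [ $status -eq 0 ]; then
      echo "Workflow $WF completed successfully"
    else
      echo "Workflow $WF failed with status $status"
    fi
    exit $status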

Environment: Informatica PowerCenter 9.5, Informatica IDQ, Oracle 10g, Teradata, SQL Server 2008, Toad, SQL*Plus, SQL Query Analyzer, SQL Developer, MS Access, Windows NT, shell scripting, ClearQuest, Tivoli Job Scheduler, Windows Azure.

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Prepared technical design/specifications for data Extraction, Transformation and Loading.
  • Created Informatica sessions in Workflow Manager to load the data from staging to the target database.
  • Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Designed SQL Server Integration Services (SSIS) packages to extract data from JDE to SQL Server Data Warehouse following the JDE specs and SQL Server standards.
  • Responsible for performance tuning at all levels of the Data warehouse.
  • Analyzed the sources, then transformed, mapped and loaded the data into targets using Informatica PowerCenter Designer.
  • Created reusable transformations to load data from operational data source to Data Warehouse and involved in capacity planning and storage of data.
  • Developed complex mappings such as Slowly Changing Dimension Type II with time stamping in the Mapping Designer (a simplified SCD Type II sketch follows this list).
  • Performed performance tuning in MS SQL Server 2005 using SQL Profiler, along with data loading.
  • Used various transformations such as Stored Procedure, connected and unconnected Lookup, Update Strategy, Filter and Joiner to implement complex business logic.
  • Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used UNIX shell scripts to automate pre-session and post-session processes.
  • Performed tuning to improve data extraction, data processing and load times.
  • Wrote complex SQL Queries involving multiple tables with joins.
  • Implemented best practices as per the standards while designing technical documents and developing Informatica ETL process.
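
The time-stamped SCD Type II pattern referenced above can be summarized in SQL terms by the following sketch, run here through a sqlplus heredoc; $ORA_CONN and the dim_customer/stg_customer tables and columns are invented placeholders, not the project's actual objects.

    #!/bin/sh
    # Illustrative time-stamped SCD Type II step executed via sqlplus.
    # $ORA_CONN, dim_customer and stg_customer are invented placeholders.
    sqlplus -s "$ORA_CONN" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE

    -- Close out current rows whose tracked attribute changed in staging
    UPDATE dim_customer d
       SET d.eff_end_dt  = SYSDATE,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert a fresh current version for changed and brand-new keys
    INSERT INTO dim_customer
           (customer_id, address, eff_start_dt, eff_end_dt, current_flg)
    SELECT s.customer_id, s.address, SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flg = 'Y');

    COMMIT;
    EXIT
    EOF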

Environment: Informatica 8.6/9.1, Oracle 10g, shell scripting, SQL Server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin 4.x, UNIX, Tortoise SVN, flat files.

Confidential

Informatica Developer

Responsibilities:

  • Actively involved in the Informatica environment upgrade from version 7.1 to 8.1.
  • Created Technical Specification and Data Mapping documents by gathering functional requirements from Business Analysts.
  • Developed Informatica mappings and workflows to create intermediate data files by integrating data from feeds and database.
  • Used SQL*Loader to load data from delimited files into Oracle tables (see the sketch following this list).
  • Extensively developed UNIX shell scripts to transfer and archive account files.
  • Developed UNIX shell scripts and SQL queries to pull data from Oracle tables before executing Informatica workflows.
  • Developed Oracle Stored Procedures to update the data in control tables and account tables.
  • Maintained versions of code and supporting documents in ClearCase (UNIX).
  • Prepared and deployed builds from DEV to other environments.
  • Created Autosys JIL scripts, control files, parameter files and UNIX environment files.
  • Coordinated with QA, BSA and development teams during System Integration and User Acceptance Testing to accurately resolve the issues.
  • Supported production jobs during the daytime.
  • Created and tracked change requests (CRs) for requirement changes and enhancements.
  • Conducted Knowledge sharing sessions with the Development team.
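
The SQL*Loader step mentioned above could be wrapped in a shell script along these lines; the control file contents, file paths, table name and connection variables are invented placeholders.

    #!/bin/sh
    # Illustrative SQL*Loader run: write a control file for a comma-delimited
    # feed, then invoke sqlldr. Paths, names and credentials are invented.
    CTL=/tmp/acct_load.ctl
    cat > "$CTL" <<'EOF'
    LOAD DATA
    INFILE '/data/in/accounts.dat'
    APPEND INTO TABLE stg_accounts
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (account_id, account_name, open_dt DATE "YYYY-MM-DD")
    EOF

    sqlldr userid="$ORA_USER/$ORA_PASS@$ORA_TNS" \
           control="$CTL" \
           log=/tmp/acct_load.log \
           bad=/tmp/acct_load.bad \
           errors=50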

Environment: Informatica PowerCenter 8.1.1, UNIX shell scripting, Windows NT, Oracle 9i, PL/SQL, Autosys, ClearCase (UNIX)
