Sr. ETL/BI Developer Resume

Emeryville, CA

SUMMARY

  • 9+ years of experience in Information Technology with a strong background in database development and data warehousing.
  • Good experience in designing and implementing data warehousing and Business Intelligence solutions using ETL tools such as Informatica PowerCenter, Informatica PowerExchange, Informatica Intelligent Cloud Services (IICS), and IBM InfoSphere DataStage.
  • Good understanding of the design, architecture, and implementation of Informatica; used Mapping Designer to load data from various sources with transformations such as Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and Unstructured Data transformations.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, star/snowflake modeling, data marts, OLAP, fact and dimension tables, and physical and logical data modeling; a minimal SCD Type 2 sketch in SQL follows this list.
  • Experience working with IICS concepts relating to data integration, Monitor, Administrator, deployments, permissions, and schedules.
  • Good understanding of SQL data warehouse concepts relating to storage, distribution, DWU units, resource user groups, connection strings, etc.
  • Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
  • Worked in different phases of the projects involving Requirements Gathering and Analysis, Design, Development, Testing, Deployment and Support.
  • Good knowledge of the database architecture of OLTP and OLAP applications, and of data analysis.
  • Worked with a wide variety of sources such as relational databases, flat files, XML files, mainframes, Salesforce objects, and unstructured data files, and with scheduling tools such as CA7, Control-M, and Informatica Scheduler.
  • Experience with various Informatica concepts such as partitioning, performance tuning, identifying bottlenecks, deployment groups, and Informatica Scheduler.
  • Strong skills and a clear understanding of requirements and of solutions to implementation issues throughout the Software Development Life Cycle (SDLC).
  • Worked in Agile methodology.
  • Hands on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes), UNIX Shell scripting and Windows Batch scripting.
  • Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
  • Experience in production support, emergency fixes, and migrating fixes from lower to higher environments per policy.
  • Excellent verbal and written communication skills; proven to be highly effective in interfacing across business and technical groups.
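
A minimal SCD Type 2 sketch in plain SQL, referenced in the Slowly Changing Dimensions bullet above. The table and column names (dim_customer, stg_customer, address) are hypothetical; in practice the same steps are typically built with Lookup and Update Strategy transformations.

    -- Step 1: expire the current version of rows whose tracked attribute changed.
    UPDATE dim_customer d
       SET eff_end_date = CURRENT_DATE,
           current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Step 2: insert a new current version for new and changed customers.
    INSERT INTO dim_customer
           (customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address = s.address);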

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5.1/9.1/8.6.x, Informatica PowerExchange 10.2/9.5.1, Informatica Developer 9.6.1, Informatica Intelligent Cloud Services (IICS), IBM InfoSphere DataStage

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server, DB2, Teradata

Scheduling Tools: CA7 Scheduler, TWS (Tivoli), Informatica Scheduler, Control-M.

Reporting Tools: MicroStrategy and Hyperion Essbase.

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, Java, Shell, Perl, Batch

Operating Systems: Windows 7, DOS, UNIX, and Linux

Other Tools: SQL*Plus, Toad, SQL Developer, PuTTY, WinSCP, MS Office, Spartan

PROFESSIONAL EXPERIENCE

Confidential, Emeryville, CA

Sr. ETL/BI Developer

Responsibilities:

  • Developed and maintained ETL (data extraction, transformation, and loading) mappings using Informatica Designer 8.6 to extract data from multiple source systems that comprise databases such as Oracle 10g and SQL Server 7.2 plus XML, CSV, XLS, and flat files, loading to the staging area, the EDW, and then the data marts.
  • Created mappings using connected, unconnected, and dynamic lookups, and built reusable components such as worklets and mapplets from other reusable transformations.
  • Created and executed mappings implementing Slowly Changing Dimensions Types 1, 2, and 4.
  • Developed mappings to load fact tables from various dimension tables.
  • Created new database objects including tables, views, indexes, stored procedures, and functions, wrote advanced queries, and updated statistics using Oracle Enterprise Manager on the existing servers.
  • Wrote insert triggers that update the same row being inserted, working around the mutating-table restriction; a minimal PL/SQL sketch follows this list.
  • Worked extensively with most of the transformations, such as Aggregator, Joiner, connected and unconnected Lookup, Normalizer, Sequence Generator, Union, Rank, and Update Strategy.
  • Experience implementing SWIFT message transformation.
  • Performed data quality analysis, standardization, and validation, developed data quality metrics, and ensured data quality at the source and target levels to generate proper data reports and profiling.
  • Fixed invalid mappings, tested stored procedures and functions, and performed unit and integration testing of Informatica sessions, batches, and target data.
  • Debugged and analyzed existing mappings using the Informatica Debugger to construct valid mappings.
  • Scheduled Sessions and Batches on the Informatica Server using Informatica Server Manager/Workflow Manager.
  • Provided and maintained support to the Operations team for changes to BI reports (Crystal Reports, ad hoc reports, etc.).
  • Extensively involved in handling daily batch jobs, load failures, delta failures, performance issues, BIA issues, system issues, reporting issues, and locking issues.
  • Provided on-call support for production issues as well as support for various SDLC cycles such as UAT, SIT, and PFIX.
  • Extensively used SAP BO to configure web URLs, email, and database connections to prepare reports, made reports visually appealing by adding different formatting options, and provided back-end support for functional and performance testing.
  • Involved in creating and modifying existing logical and physical data models, and generated scripts for the physical data model.
  • Worked on the master-child packages technique (Kimball's approach) to manage data loading to the data warehouse efficiently.
  • Performed independent audits and reviews and validated BI artifacts (stored procedures, queries, reports, universes, etc.) to ensure data integrity.
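
A minimal PL/SQL sketch of the insert-trigger pattern mentioned above; the table and columns (orders, load_date, status) are hypothetical. Assigning to :NEW in a BEFORE INSERT row trigger changes the row being inserted without querying or updating the table itself, which is what sidesteps the ORA-04091 mutating-table error.

    CREATE OR REPLACE TRIGGER trg_orders_bi
    BEFORE INSERT ON orders
    FOR EACH ROW
    BEGIN
      -- Adjust the incoming row in place via :NEW.
      :NEW.load_date := SYSDATE;
      :NEW.status    := UPPER(:NEW.status);
    END;
    /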

Environment: Informatica 10.0.1, Oracle 10g, CA ERwin Data Modeler 8, SAP BO, UNIX, PuTTY, SQL Developer

Confidential, Philadelphia, PA

ETL SAS Developer

Responsibilities:

  • Involved in requirement gathering, analysis, and design.
  • Developed the profiling job for data analysis.
  • Developed data and process jobs.
  • Profiled, cleansed, and standardized the data.
  • Identified the suitable data type and definition for each field.
  • Designed the high-level and detailed design for the project.
  • Generated match codes and identified sensitivity.
  • Clustered the data for deduplication; a plain-SQL sketch of the idea follows this list.
  • Tuned jobs for better performance.
  • Developed mappings to load staging tables and then Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Developed reusable cleansing services that can be used across various projects.
  • Updated the configuration files for better performance.
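
The clustering and deduplication above were done in Data Management Studio; this hedged sketch shows the same survivorship idea in plain SQL with hypothetical names (customer_stg, match_code, last_updated): rows are grouped on a match code and only the best-ranked record per cluster survives.

    SELECT *
      FROM (SELECT c.*,
                   ROW_NUMBER() OVER (
                     PARTITION BY c.match_code        -- cluster key from matching
                     ORDER BY     c.last_updated DESC -- survivorship rule
                   ) AS rn
              FROM customer_stg c)
     WHERE rn = 1;  -- keep one survivor per cluster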

Environment: Data Management Studio 2.9, Linux, Informatica 9.6, Oracle, Flat File, UNIX

Confidential, Philadelphia, PA

ETL Informatica Intelligent Cloud Services (IICS) Developer

Responsibilities:

  • Interacted with the business users to identify process metrics and various key dimensions and facts, and was involved in the full life cycle of the project.
  • Assisted the architect in developing the STG/ODS/Hub/dimensional warehouse in the Teradata data warehouse.
  • Assisted in defining logical and physical database models for building the new enterprise data warehouse in the cloud to replace the existing on-premises warehouse.
  • Identified ETL specifications based on business requirements and created ETL mapping documents and high-level documentation for the product owners and data managers.
  • Defined modeling and naming standards and best practices for the modeling team to use in data models as well as in DDLs and DMLs when creating new data elements and adding attributes.
  • Effectively used the IICS Data Integration console to create mapping templates to bring data into the staging layer from different source systems such as SQL Server, Oracle, Teradata, Salesforce, flat files, Excel files, and PWX CDC.
  • Experience working with IICS transformations such as Expression, Joiner, Union, Lookup, Sorter, Filter, and Normalizer, and with concepts such as macro fields to templatize column logic, smart match fields, and bulk field renaming.
  • Experience with PWX concepts including registration maps, logger, listener, and condense files.
  • Implemented email notifications for mapping task success/failure, sending emails to business user groups to notify them of data/files.
  • Experience working with custom-built queries to load dimensions and facts in IICS.
  • Extensively worked on creating datasets and dashboards for reporting in Salesforce Einstein Analytics.
  • Experience working with the Salesforce connector to read data from Salesforce objects into the cloud warehouse using IICS.
  • Performed bulk loads of JSON data from an S3 bucket to Snowflake; a hedged SQL sketch follows this list.
  • Used Snowflake functions to perform semi-structured data parsing entirely with SQL statements.
  • Experience working with IICS monitoring and administrator concepts.
  • Experience working with data integration concepts including mappings, mapping configuration tasks, taskflows, deployment using Git automation, schedules, connections, and API integration.
  • Experience working with key-range partitioning in IICS, handling file loads with the file list option, creating fixed-width file formats, and using file listeners.
  • Experience integrating data using IICS for reporting needs.
  • Experience building the semantic layer after fact loads so that reporting can connect to the data warehouse.
  • Responsible for deployments to higher environments and production support during the warranty period before turning over to managed services.
  • Worked in Agile methodology with two-week sprints on enhancements.
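
A hedged Snowflake sketch of the S3 bulk load and semi-structured parsing described above; the stage, table, and JSON field names (my_s3_stage, raw_events, payload, items) are assumptions, not project specifics.

    -- Land raw JSON from the S3 external stage into a VARIANT column.
    CREATE OR REPLACE TABLE raw_events (v VARIANT);

    COPY INTO raw_events
      FROM @my_s3_stage/events/
      FILE_FORMAT = (TYPE = 'JSON');

    -- Parse the semi-structured data entirely with SQL.
    SELECT v:event_id::NUMBER              AS event_id,
           v:payload.amount::DECIMAL(12,2) AS amount,
           f.value:sku::STRING             AS sku
      FROM raw_events,
           LATERAL FLATTEN(input => v:payload.items) f;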

Environment: Informatica Intelligent Cloud Service (IICS), Data Integration, Salesforce Einstein Analytics, GIT, Teradata, SQL Server Management Studio, Informatica Power Center 10.2, UNIX, Spartan

Confidential, Charlotte, NC

ETL Informatica PowerCenter Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.
  • Maintained stored definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL overrides; a hedged example follows this list.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in workflows such as sessions, Event-Raise, Event-Wait, Decision, Email, Command, worklets, Assignment, Timer, and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL*Loader to load data from flat files to database tables in Oracle.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of the repository and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
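
A hedged example of the parameter-driven SQL override mentioned above. The table, columns, and parameter names ($$LAST_EXTRACT_DATE, $$REGION_CODE) are hypothetical; the $$ parameters would be declared in the mapping and supplied through a parameter file at run time.

    SELECT o.order_id,
           o.customer_id,
           o.order_amt,
           o.updated_ts
      FROM orders o
     WHERE o.updated_ts > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')
       AND o.region_code = '$$REGION_CODE'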

Environment: Informatica Power Center 9.6.1, Workflow Manager, Workflow Monitor, PL/SQL, Oracle 10g/9i, Erwin, Autosys, Sybase, UNIX AIX, Toad 9.0.

Confidential, Plano, TX

ETL Informatica PowerCenter Developer

Responsibilities:

  • Performed extraction, transformation, and loading of data using Informatica.
  • Designed the target load process based on the requirements.
  • Enhanced the existing mappings using Informatica PowerCenter.
  • Tested and debugged the enhanced mappings.
  • Created the ETL run book.
  • Developed mappings and workflows to generate staging files.
  • Developed mappings and workflows to load the data into Oracle tables.
  • Developed various transformations such as Source Qualifier, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into target tables.
  • Created workflows, tasks, and database connections using Workflow Manager.
  • Developed complex Informatica mappings and tuned them for better performance.
  • Created sessions and batches to move data at specific intervals and on demand using Server Manager.
  • Created and scheduled sessions.
  • Recovered failed sessions and batches.
  • Extracted data from SQL Server, Sybase, and flat files.
  • Implemented performance-tuning techniques, guided by the functional requirements, by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions; a hedged source-side filtering example follows this list.
  • Responsible for identifying records missed in different stages from source to target and resolving the issue.
  • Extensively worked on performance tuning for mappings and ETL procedures at both the mapping and session levels.
  • Developed, tested, and debugged mappings in Informatica.
  • Worked extensively in a UNIX environment.
  • Prepared the documents for test data loading.
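
A hedged sketch of one common bottleneck fix noted above: pushing a filter from the mapping down into the source query so fewer rows are read and moved. The table and columns (txn, txn_date, status) are hypothetical.

    -- Instead of SELECT * followed by a Filter transformation,
    -- restrict rows at the source.
    SELECT t.txn_id, t.acct_id, t.txn_amt, t.txn_date
      FROM txn t
     WHERE t.txn_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -1) -- recent cycle only
       AND t.status = 'POSTED';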

Environment: Informatica Power center 9.6, Sybase, Oracle, Flat File, UNIX shell scripts, Windows.

Confidential, Charlotte, NC

ETL Informatica Developer

Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation, and loading into targets.
  • Responsible for impact analysis and upstream/downstream impacts.
  • Created detailed technical specifications for the data warehouse and ETL processes.
  • Used Informatica as the ETL tool to pull data from source systems/files, cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Used most of the transformations, such as Source Qualifier, Expression, Aggregator, Filter, connected and unconnected Lookup, Joiner, Update Strategy, and Stored Procedure.
  • Extensively used pre-SQL and post-SQL scripts for loading data into the targets according to the requirements.
  • Developed mappings to load fact and dimension tables (SCD Type 1 and SCD Type 2 dimensions) with incremental loading, and unit tested the mappings.
  • Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version of Informatica.
  • Involved in initial loads, incremental loads, and daily loads to ensure that the data is loaded into the tables in a timely and appropriate manner.
  • Extensively worked on performance tuning of Teradata SQL, ETL, and other processes to optimize session performance.
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Worked extensively with different caches, such as the index cache, data cache, and lookup cache (static, dynamic, and persistent), while developing mappings.
  • Created reusable transformations, mapplets, and worklets using Transformation Developer, Mapplet Designer, and Worklet Designer.
  • Integrated the data into a centralized location using migration, redesign, and evaluation approaches.
  • Responsible for unit testing and integration testing, and helped with user acceptance testing.
  • Tuned mapping performance by following Informatica best practices and applied several methods to get the best performance by decreasing workflow run times.
  • Worked extensively on Informatica partitioning when dealing with huge volumes of data, and partitioned the tables in Teradata for optimal performance; a hedged DDL sketch follows this list.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Autosys.
  • Managed post-production issues and delivered all assignments/projects within specified timelines.
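
A hedged Teradata DDL sketch of the table partitioning mentioned above; the table, columns (sales_fact, sale_date), and date range are assumptions. A partitioned primary index lets the optimizer eliminate partitions on date-ranged scans.

    CREATE TABLE sales_fact (
      sale_id   BIGINT NOT NULL,
      store_id  INTEGER,
      sale_date DATE   NOT NULL,
      sale_amt  DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id)
    PARTITION BY RANGE_N (
      sale_date BETWEEN DATE '2015-01-01' AND DATE '2020-12-31'
                EACH INTERVAL '1' MONTH
    );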

Environment: Informatica Power Center 9.5, Informatica Metadata Manager, Netezza, Autosys, UNIX, Toad, Erwin.

Confidential, Emeryville, CA

ETL Informatica Developer

Responsibilities:

  • Gathered business requirements from the Business Analyst.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Designed and developed ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the target system.
  • Worked on loading data from several flat files to targets.
  • Designed the procedures for moving data from all source systems to the data warehousing system.
  • Created the environment for the staging area and loaded the staging area with data from multiple sources; a hedged external-table sketch follows this list.
  • Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
  • Database connection management and scheduling of jobs.
  • Monitored sessions using the workflow monitor, which were scheduled, running, completed, or failed. Debugged mappings for failed sessions.
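
A hedged Oracle sketch of one common way to stage flat-file data, as referenced above; the directory, file, and names (stage_dir, customers.csv, stg_customers) are hypothetical. An external table exposes the file to SQL so the staging table can be filled with a plain INSERT ... SELECT.

    CREATE TABLE stg_customers_ext (
      customer_id NUMBER,
      cust_name   VARCHAR2(100),
      state_cd    VARCHAR2(2)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY stage_dir    -- database directory over the file drop
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('customers.csv')
    );

    INSERT INTO stg_customers SELECT * FROM stg_customers_ext;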

Environment: IBM Infosphere DataStage, Oracle 9i, Windows NT, Flat files, SQL, and UNIX Shell Scripts.
