DW/ETL Developer Resume

Sunnyvale, CA

SUMMARY

  • Over eight years of experience as an ETL Developer building ETL for Data Warehouse/Data Migration projects using the Informatica Power Center and SSIS (SQL Server Integration Services) ETL tools.
  • Data processing experience in designing and implementing Data Mart applications, mainly transformation processes, using the Informatica Power Center and IDQ ETL tools.
  • Extensive experience in the design and development of Data Warehouse applications, primarily in Oracle, using PL/SQL programming and IBM DataStage for ETL, Erwin for data modelling, and Perl/shell scripting for batch processing in Unix/Linux environments.
  • Strong experience in Data Warehousing and in Dimensional Star Schema and Snowflake Schema methodologies.
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, and Queryman), Teradata parallel support, and Perl and Unix shell scripting.
  • Experience working with MS SQL Server, Oracle 11g, Oracle APEX, Big Data, Oracle NoSQL, and Oracle Exadata.
  • Experience implementing OBIEE 11.1.1.3.0/10.1.3.3.x and Oracle BI Applications 7.9.x, including hands-on expertise in RPD development, Siebel/Oracle BI Answers, BI Dashboards, BI Delivers, and Reports.
  • Hands-on experience in data modelling (Erwin), data analysis, data integration, data mapping, and ETL/ELT processes, and in applying dimensional data modelling (star schema) concepts in data warehouses.
  • Hands on experience in developing Stored Procedures, Functions, Views and Triggers, SQL queries using SQL Server and Oracle SQL.
  • Extensively used Pentaho Report Designer, Pentaho Kettle, and Pentaho BI Server.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
  • Hands-on experience with various NoSQL databases such as HBase and MongoDB.
  • Experience in integrating business application with Informatica MDM hub using Batch process, SIF and message queues.
  • Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.
  • Applied the concept of Change Data Capture (CDC) and imported the source from Legacy systems using Informatica Power Exchange (PWX).
  • Good experience with implementing Informatica B2B DX/DT and Informatica BDE.
  • Good Knowledge of EAI, ESB, B2B integration.
  • Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.
  • Experience in importing/exporting data between different sources such as Oracle, Access, and Excel using the SSIS/DTS utility.
  • Expertise in using global variables, expressions and functions for the reports with immense experience in handling sub reports in SSRS.
  • Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
  • Experience with the ODI ETL tool, including familiarity with version 12c.
  • Participated in Business Requirements, Use Case, Agile stand-up, and Defect Triage meetings.
  • Exposure to Informatica Cloud Services.
  • Good exposure to big data technologies such as Hadoop, Hive, HBase, MapReduce, and HDFS.
  • Experience with industry Software development methodologies like Waterfall, Agile within the software development life cycle.
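Several of the points above (Change Data Capture, incremental loading) rest on watermark-style delta extraction. As an illustrative sketch only — not the actual Power Exchange CDC implementation, and with hypothetical column names — the core pattern looks like:

```python
from datetime import datetime

def extract_changes(source_rows, last_run_ts):
    """Timestamp-based CDC: return only rows changed since the previous load.

    source_rows: list of dicts with an 'updated_at' datetime column (hypothetical).
    last_run_ts: watermark recorded at the end of the previous run.
    """
    return [r for r in source_rows if r["updated_at"] > last_run_ts]

rows = [
    {"id": 1, "updated_at": datetime(2023, 1, 1)},
    {"id": 2, "updated_at": datetime(2023, 3, 1)},
]
# Only row 2 changed after the 2023-02-01 watermark.
delta = extract_changes(rows, last_run_ts=datetime(2023, 2, 1))
```

Real CDC tools read database logs rather than comparing timestamps, but the watermark idea is the same.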

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10/9.6/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Power Connect, Power Exchange, Informatica Power Mart 6.2/5.1.2/5.1.1/5.0/4.7.2, Informatica Web Services, Informatica MDM 10.1/9.x, OBIEE 11g/10g, Oracle Data Integrator 12c/11g, OBIA/BI Apps 11g/7.9.6.x/7.9.5, Oracle Warehouse Builder (OWB), Informatica CDC, Talend RXT 4.1, Informatica BDE, Informatica B2B DX/DT (version 10), SQL*Loader, Informatica On Demand (IOD), Flat Files (Fixed-width, CSV, Tilde-delimited, XML), IDQ, IDE, Oracle Data Integrator (ODI), Data Transformation Services (DTS), Exadata, Metadata Manager, MS SQL Server Integration Services (SSIS).

Dimensional Data Modelling: Dimensional Data Modelling, Star Schema Modelling, Snow-Flake Modelling, FACT and Dimensions Tables, Physical and Logical Data Modelling.

Scheduling Tool: OBIEE DAC, Autosys, Tidal, Control M, Puppet, Chef

Reporting Tools: SSRS, Business Intelligence Tools, Tableau, Power BI, Cognos 8, MS Access

Database and related tools: Oracle 10g/9i/8i/8/7.x, MS SQL Server 2000/7.0/6.5, Teradata, Netezza, Amazon S3, Vertica, Sybase ASE, PL/SQL, T-SQL, NoSQL, TOAD 8.5.1/7.5/6.2, DB2 UDB, Amazon Redshift, Red Hat Enterprise Linux

Languages: SQL, PL/SQL, SQL*Plus, Dynamic SQL, C, C#, Java; working knowledge of UNIX shell scripting and Perl scripting

Web Technologies: HTML, XHTML and XML

Operating Systems: Microsoft Windows XP/NT/2000/98/95, UNIX, Sun Solaris 5

Cloud Technologies: AWS, Azure, Informatica Cloud

PROFESSIONAL EXPERIENCE

Confidential, Carol Stream, IL

Senior ETL/Informatica Developer

RESPONSIBILITIES:

  • Using Informatica Power Center tools, developed workflows with the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored results using Workflow Monitor.
  • Gathered requirements from accounting end users on various exchange fees and documented them along with the data analysis, data mapping, design, data model, test plan, implementation plan, and process flow diagrams.
  • Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
  • Loaded consignment-related data into the EDW based on business rules, extracting the data from the SAP source system.
  • Based on requirements, developed the Source-To-Target mapping document with business rules and developed the ETL specification documentation.
  • Implemented / Developed incremental ETL mappings.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote PL/SQL scripts for pre & post session processes and to automate daily loads.
  • Developed applications and batch processes using Perl, shell scripting, and JavaScript over two years.
  • Wrote heavy stored procedures using dynamic SQL to populate data into temp tables from fact and dimensional tables for reporting purpose.
  • Implemented Informatica Cloud data integration with Amazon Redshift and S3.
  • Developed views and templates with Python, using Django's view controller and template language to create the website interface.
  • Created logical and physical data models using Erwin, along with process flow diagrams and data mapping documents.
  • Extracted data from Teradata, Big Data, Oracle NoSQL, and Oracle Exadata databases.
  • Designed and developed methodologies to migrate multiple development/production databases from Sybase to Oracle 11g.
  • Expertise in Dynamic SQL, Collections and Exception handling.
  • Extensively used Perl scripts to edit the XML files and calculate line counts according to the client's needs.
  • Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Developed Informatica code to extract data from different SAP systems such as ECC, BW, and PLM and load it into the Hadoop environment at the HDFS and Hive levels.
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response times for users.
  • Used Informatica power center 9.6.1 to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
  • Performed data modelling and high-level design (HLD), plus analysis and preparation of low-level design (LLD).
  • Performed webMethods installation, clustering, patching, and maintenance of application servers.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe system.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Tuned mapping performance by following Informatica best practices and applied several methods to decrease workflow run times.
  • Prepared SQL Queries to validate the data in both source and target databases.
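The slowly changing dimension bullet above refers to the standard Type 2 pattern: when a tracked attribute changes, expire the current dimension row and insert a new current version so the full history is preserved. A minimal illustrative sketch, with hypothetical column names rather than the production mapping:

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Type 2 SCD: expire the current row on change, insert a new version."""
    current = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["cust_id"])
        if cur is None:
            # Brand-new key: insert as the current version.
            dimension.append({**rec, "eff_date": today, "end_date": None, "is_current": True})
        elif cur["city"] != rec["city"]:
            # Tracked attribute changed: close out the old row, add a new one.
            cur["end_date"] = today
            cur["is_current"] = False
            dimension.append({**rec, "eff_date": today, "end_date": None, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Austin", "eff_date": date(2022, 1, 1),
        "end_date": None, "is_current": True}]
dim = scd2_apply(dim, [{"cust_id": 1, "city": "Dallas"}], today=date(2023, 6, 1))
# dim now holds two rows for cust_id 1: the expired Austin row and the current Dallas row.
```

In Informatica this same decision is typically made with a Lookup plus an Update Strategy transformation; the sketch only shows the row-versioning logic.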

ENVIRONMENT: Informatica Power Center 10/9.6, EBS, Informatica BDE, Hive 2.7, Teradata 12, SSRS, Oracle 11g/10g, PL/SQL, Perl scripting, MapReduce, Autosys, TOAD 9.x, Informatica Cloud, Oracle Financials, shell scripting, Python, Dynamic SQL, Oracle SQL*Loader, SSIS 2008, Sun Solaris UNIX, OBIEE, Windows XP, Amazon S3, Red Hat Linux.

Confidential, Los Angeles, CA

Senior ETL Developer

RESPONSIBILITIES:

  • Designed the physical model and ETLs to source data from current systems.
  • Built the necessary staging tables and work tables in the Oracle development environment.
  • Designed a flexible ETL process that runs as an incremental load or a full load with minimal tweaking.
  • Developed Mappings, Sessions and Workflows to extract, validate, and transform data according to the business rules.
  • Developed mappings/sessions using Informatica Power Center 8.6.1 for data loading.
  • Built out best practices for data staging, data cleansing, and data transformation routines within the Informatica MDM solution.
  • Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
  • Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
  • Created High level/detailed level design documents and also involved in creating ETL functional and technical specification.
  • Used IKM E-Business Suite to integrate data to E-Business Suite using Open Interface tables.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
  • Created reports in the Power BI preview portal utilizing SSAS Tabular via the Analysis Services connector.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Used the advanced features of PL/SQL like Records, Tables, Object types.
  • Created programming code using advanced concepts of Records, Collections, and Dynamic SQL.
  • Worked on an internal Infosys Limited project on BizTalk Server (a Microsoft tool) used for enterprise integration.
  • Created and configured workflows, worklets, and sessions to transport data to target Netezza warehouse tables using Informatica Workflow Manager.
  • Ensured that integration testing and system testing followed QA and change management procedures.
  • Deployed the latest stable Bugzilla and its customized skin, fully automated through Puppet.
  • Automated the Informatica jobs using UNIX shell scripting.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Created Oracle 11g databases and replicated Sybase schema objects to Oracle.
  • Troubleshot PL/SQL procedures and functions to support corresponding Sybase functionality.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.
  • Created reusable SSIS packages to extract data from multi-formatted flat files, Excel, and XML files into the UL database and DB2 billing systems.
  • Developed, deployed, and monitored SSIS Packages.
  • Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
  • Modified existing mappings for enhancements of new business requirements.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Created ODI packages and scenarios using interfaces, variables, and procedures.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
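The flexible incremental/full load design mentioned in this role can be sketched as a single parameterized load plan. This is an illustration of the idea only; the table and column names are hypothetical, and the real process ran inside Informatica rather than generated SQL:

```python
def build_load_plan(mode, target_table, watermark_col="load_ts", last_run=None):
    """Return the SQL steps for a full or incremental run of the same process,
    so one ETL covers both modes with a single parameter tweak."""
    if mode == "full":
        # Full refresh: wipe the target, then reload everything from staging.
        return [f"TRUNCATE TABLE {target_table}",
                f"INSERT INTO {target_table} SELECT * FROM stg_{target_table}"]
    if mode == "incremental":
        # Delta load: only rows stamped after the previous run's watermark.
        return [f"INSERT INTO {target_table} "
                f"SELECT * FROM stg_{target_table} "
                f"WHERE {watermark_col} > '{last_run}'"]
    raise ValueError(f"unknown load mode: {mode}")

full_plan = build_load_plan("full", "fact_sales")
inc_plan = build_load_plan("incremental", "fact_sales", last_run="2023-01-01")
```

The single switch point keeps the two modes from drifting apart, which is the maintenance benefit the bullet describes.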

ENVIRONMENT: Informatica Power Center 9.1/8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect/Power Exchange, Dynamic SQL, Informatica On Demand (IOD), SSIS/SSRS 2008, Control M, Data Analyzer 8.1, Task Factory, Oracle BI Apps 7.9.6.1, Hyperion, Power BI 2013, Oracle Data Integrator 10.1.3.5, PL/SQL, OBIEE, Sybase, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, UNIX AIX, Toad 9.0, Cognos 8.

Confidential, Dallas, TX

IDQ ETL Developer

RESPONSIBILITIES:

  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 components Edit Distance, Jaro Distance, and Mixed Field Matcher; this enabled the creation of a single view of customers and helped control mailing-list costs by preventing multiple pieces of mail.
  • Configured, designed, and delivered MDM hubs across multiple data domains (Party, Service/Product, Prospect).
  • Automated the onboarding of new external data sources and trading partners.
  • Automated the processing of unstructured data formats (PDF, XLS, DOC, etc.).
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Data Profiling, Cleansing, Standardizing using IDQ and integrating with Informatica suite of tools.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
  • Worked on performance tuning of programs, ETL procedures and processes.
  • Involved in Dimensional modelling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted data from oracle database and spreadsheets and staged into a single place and applied business logic to load them in the central oracle database.
  • Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse.
  • Worked on Power BI and prepared dashboards using Tableau for clearer presentation.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

ENVIRONMENT: Informatica Power Center 9.0/8.x/7.1.3, Informatica Data Quality, Informatica MDM, Power Exchange 8.6.1, Data Explorer 5.0, Oracle 10g/9i, PL/SQL, SSIS/SSRS, Toad 10.5/9.5, Cognos 8.4, Power BI, Puppet, Windows XP Pro and AIX UNIX, PVCS, Tidal, Magic ticket management, SQL Server, Teradata SQL Assistant, Teradata external loaders (TPump, MLoad).

Confidential, Sunnyvale, CA

ETL Developer

RESPONSIBILITIES:

  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Developed several reusable transformations and mapplets that were used in other mappings.
  • Prepared Technical Design documents and Test cases.
  • Involved in unit testing and the resolution of various bottlenecks encountered.
  • Implemented various Performance Tuning techniques.
  • Gathered business requirements from Business Analyst.
  • Created, modified, and formatted requests, along with charts and filters, in OBIEE Answers.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Customized and developed the OBIEE Physical Layer, Business Model and Mapping Layer.
  • Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it.
  • Took part in the STEP ERP project by implementing Oracle BI Apps using the ODI and OBIEE application tools.
  • Installed and Configured the Informatica Client tools.
  • Worked on loading of data from several flat files to XML Targets.
  • Designed the procedures for getting the data from all systems to Data Warehousing system.
  • Created the environment for Staging area, loading the Staging area with data from multiple sources.
  • Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
  • Used industry-standard technologies such as AJAX, Git, and Apache2, along with Python Flask and PHP, to create a web-based wireless testing system.
  • Used Py4J to dynamically access Java objects in Android's Java Virtual Machine, writing Java code to call Python objects through the Py4J module.
  • Worked with the transformations like Lookup, Joiner, Filter, Aggregator, Source Qualifier, Expression, Update Strategy and Sequence Generator Transformations.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Created UNIX shell scripts for Informatica ETL tool to automate sessions.
  • Monitored sessions using the workflow monitor, which were scheduled, running, completed or failed. Debugged mappings for failed sessions.
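The UNIX automation of Informatica sessions mentioned above typically shells out to the pmcmd command-line tool. A hedged sketch that only builds the command string, with placeholder service, domain, folder, and workflow names; the -pv option is used so the password comes from an environment variable instead of the script:

```python
def pmcmd_start(service, domain, user, folder, workflow, pwd_env="PMPASS"):
    """Build a pmcmd startworkflow command line of the kind used in
    shell-script automation. All connection values here are placeholders."""
    return ("pmcmd startworkflow "
            f"-sv {service} -d {domain} -u {user} -pv {pwd_env} "
            f"-f {folder} -wait {workflow}")

cmd = pmcmd_start("INT_SVC", "Domain_ETL", "etl_user", "DWH_LOADS", "wf_daily_load")
# A wrapper script would execute this string and check the exit code,
# since -wait makes pmcmd block until the workflow finishes.
```

Building the command in one place keeps credentials out of individual job scripts and makes scheduler integration (Autosys, Control M) a one-line call.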

ENVIRONMENT: OBIEE 10.1.3.2, OBIA/BI Apps 7.9.6.4, OBIEE DAC, Informatica Power Center 8.6/8.1, Informatica MDM, Oracle 9i, Mainframe (DB2), MS SQL Server, PL/SQL Developer, Python, SSIS/SSRS, Bourne shell, Windows XP, TOAD, MS Office, and delimited flat files.

Confidential

DW/ETL Developer

RESPONSIBILITIES:

  • Involved in design, development and maintenance of database for Data warehouse project.
  • Involved in Business Users Meetings to understand their requirements.
  • Developed the OBIEE Repository with all three layers: Physical Layer, Business Model and Mapping Layer, and Presentation Layer.
  • Used UNIX and shell scripting extensively to enhance the Perl scripts and to develop, schedule, and support Control M batch jobs for data generation and reporting.
  • Used Perl and shell scripts to invoke the stored procedures for data load, computation, and report generation.
  • Created SSRS reports for Franchise Health Reports.
  • Worked on SSIS Package, DTS Import/Export for transferring data from Database (Oracle and Text format data) to SQL Server.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Worked with complex mappings having an average of 15 transformations.
  • Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Performed unit testing, integration testing, and system testing of Informatica mappings; coded PL/SQL scripts.
  • Wrote UNIX scripts, Perl scripts for the business needs.
  • Coded UNIX scripts to capture data from different relational systems into flat files for use as source files in the ETL process.
  • Created Universes and generated reports using the Star Schema.
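Capturing relational data into delimited flat files for ETL, as described above, can be sketched with Python's csv module. This is illustrative only — the original work used UNIX scripts against the source databases, and the sample rows here are made up:

```python
import csv
import io

def rows_to_flat_file(rows, delimiter="|"):
    """Write query-result rows (dicts) to a pipe-delimited flat file.
    Uses an in-memory buffer here; a real extract would write to disk."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow(rows[0].keys())          # header row from column names
    for r in rows:
        writer.writerow(r.values())          # one delimited line per record
    return buf.getvalue()

flat = rows_to_flat_file([{"id": 1, "name": "Acme"},
                          {"id": 2, "name": "Globex"}])
```

A pipe (or tilde) delimiter is a common choice for such extracts because commas occur frequently inside name and address fields.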

ENVIRONMENT: OBIEE 10.1.3.2, Informatica Power Center 8.1, Oracle 11g, SSIS/SSRS 2005, UNIX
