
Senior ETL Developer Resume


Richmond, VA

SUMMARY

  • 9+ years of IT experience with extensive Data Warehousing implementations across the Banking, Health Care, Financial, Telecommunications, and Government sectors.
  • Extensive Data Warehousing experience in business requirement analysis, design, development, testing, implementation, loading, maintenance, and enhancement of Data Warehouse and ODS systems using Informatica 9.x/8.x/7.x/6.x, Oracle, DB2, SQL Server, Teradata, and PL/SQL; well versed in Data Warehousing architecture, Operational Data Stores, and Dimensional Data Modeling.
  • Two (2) years of Dimensional Data Modeling using Star and Snowflake schemas. Extensive experience with Ralph Kimball and Bill Inmon methodologies. Designed data models using Erwin 4.
  • Three (3) years of experience working with Business Intelligence tools such as OBIEE/Siebel Analytics, Business Objects, Cognos, and Tableau.
  • Strong understanding of OLAP and OLTP Concepts.
  • Good experience with installation and configuration of the domain, Repository Service, and Integration Service, and with administration tasks such as creating users and folders, backups, and migrations.
  • Experience working in both Waterfall and Agile methodologies.
  • Experience working in multi-terabyte environments. Solid experience coding in SQL, SQL*Plus, and PL/SQL stored procedures/functions, triggers, and packages. Extensive working experience applying relational database concepts, Entity Relationship diagrams, Data Flow diagrams, Oracle Designer 2000, and normalization concepts. Expertise in Oracle utility tools such as SQL*Loader, Import, and Export. Expert in UNIX shell scripting.
  • Solid experience with the Informatica and Teradata combination in enterprise data warehouse environments, and strong experience using Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts.
  • Experience using automation scheduling tools such as Autosys and Control-M.

TECHNICAL SKILLS

DWH Tools: Informatica PowerCenter 9.x/8.x, Power Exchange 9.x/8.x, Informatica IDQ 9.x, Informatica MDM, Informatica MDM Hub Console, OLTP, OLAP and SAP 5.x

Data Modelling: Erwin 4.0/3.5

BI Tools: Business Objects 6.0, Cognos 10.1 and Tableau

Programming: SQL, PL/SQL, Transact SQL, UNIX Scripting & Python Scripting

Databases: Oracle 11g/10g/9i, SQL Server 2012/2010, Teradata 13/12, DB2, Netezza 4.x & MS Access

Scheduling Tools: Autosys and Control-M.

Other Tools: Toad, SharePoint, PuTTY, GIT, MATT

Operating Systems: Linux, UNIX, Sun Solaris, Windows 7/XP/2000/98

PROFESSIONAL EXPERIENCE

Confidential, Richmond, VA

Senior ETL Developer

Responsibilities:

  • Involved in requirement analysis, ETL design and development for extracting data from the source systems like Oracle, flat files, XML files and loading into EDW (Enterprise Data Warehouse).
  • Responsible for design and implementation needed for loading and updating the warehouse.
  • Worked with analysts and data source systems experts to map requirements to ETL code.
  • Developed complex mapping logic using various transformations like Expression, Lookup (Connected and Unconnected), Joiner, Filter, Sorter, Router, Update Strategy and Sequence Generator.
  • Converted functional specifications into technical specifications (design of mapping documents).
  • Worked on developing UNIX Scripts for job automations.
  • Used Autosys and cron to schedule jobs.
  • Created deployment groups to deploy objects.
  • Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
  • Efficiently used Informatica Workflow Manager, Workflow monitor to create, schedule and control Workflows, Tasks, and Sessions.
  • Created scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information, such as dates, from file names for continuously arriving sources (see the sketch following this list).
  • Analyzed existing mainframe COBOL programs to develop detailed job flow diagrams and design documents.
  • Performed performance tuning, maintained, and fixed production issues in existing code; modified existing code per new business requirements.
  • Performed development using Teradata utilities like BTEQ, FastLoad, MultiLoad and TPT to populate the data into the BI DW.
  • Generated reports using Teradata BTEQ.
  • Worked on optimizing and tuning Teradata SQL to improve batch performance and data response time for users.
  • Used the FastExport utility to extract large volumes of data and send files to downstream applications.
  • Developed TPump scripts to load low-volume data into the Teradata RDBMS in near real time.
  • Created BTEQ scripts to load data from the Teradata staging area into Teradata target tables.
  • Performance tuning of Teradata SQL statements using Teradata EXPLAIN.
  • Used different data profiling techniques for better data analysis, i.e., column profiling and filter options, for a better data overview and to identify data anomalies.
  • Used the Teradata SQL Assistant Import and Export utility to move data from Production to Development to refresh staging tables.
  • Good understanding of relational database management systems like Oracle and Teradata; extensively worked on data integration using Informatica for the extraction, transformation, and loading of data from various database source systems and mainframe COBOL and VSAM files.
  • Wrote complex SQL scripts to avoid Informatica Joiners and Lookups.
  • Created work tables, global temporary tables, volatile tables as part of developing the script/code.
  • Worked on Teradata SQL, BTEQ, MultiLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to drive ETL interfaces through BTEQ, FastLoad, or FastExport. Created numerous Volatile, Global, Set, and MultiSet tables. Created batch jobs for FastExport.
  • Migrated code/objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.
  • Involved extensively in Unit testing, integration testing, system testing and UAT.
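
A minimal sketch of the kind of file-handling and BTEQ load script referenced above. The directory paths, the SRC_YYYYMMDD.dat file naming pattern, the logon file, and the staging/target table names are illustrative assumptions, not the actual project artifacts:

    #!/bin/bash
    # Illustrative sketch only: paths, file pattern, and table names are assumed.
    INBOX=/data/inbound            # assumed landing directory for incoming source files
    ARCHIVE=/data/archive          # assumed archive directory
    LOGON=/home/etl/.tdlogon       # assumed file containing ".LOGON tdpid/user,password;"

    for f in "$INBOX"/SRC_*.dat; do
        [ -e "$f" ] || continue                                          # nothing to process
        fname=$(basename "$f")
        file_dt=$(echo "$fname" | sed 's/SRC_\([0-9]\{8\}\)\.dat/\1/')   # pull YYYYMMDD out of the name

        # Load that business date from the staging table to the target table via BTEQ
        bteq <<EOF
    .RUN FILE = $LOGON;
    INSERT INTO edw.target_tbl (acct_id, txn_amt, load_dt)
    SELECT acct_id, txn_amt, CAST('$file_dt' AS DATE FORMAT 'YYYYMMDD')
    FROM   stg.source_tbl
    WHERE  load_dt = CAST('$file_dt' AS DATE FORMAT 'YYYYMMDD');
    .QUIT;
    EOF
        rc=$?
        if [ $rc -eq 0 ]; then
            mv "$f" "$ARCHIVE/$fname"                                    # archive the processed file
        else
            echo "BTEQ load failed for $fname (rc=$rc)" >&2
            exit $rc
        fi
    done

A script like this is the kind of wrapper that would then be scheduled through Autosys or cron as described above.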

Environment: Informatica PowerCenter 10.1/9.6.2, IDQ 10.1, Informatica MDM, Teradata, PostgreSQL, Oracle, AWS and JavaScript.

Confidential, Austin, Texas

Senior/Lead ETL Developer

Responsibilities:

  • Worked with ETL Architects and Senior developers for understanding the requirements.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Worked with the team to design, develop, test, and implement the system. Created database objects such as tables and procedures using Oracle tools like PL/SQL and TOAD; wrote stored procedures in PL/SQL.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored Procedure transformations.
  • Implemented several database triggers and functions in Oracle SQL and PL/SQL to improve the performance and efficiency of large, complex queries.
  • Developed an ETL Informatica mapping to load data into the staging area. Extracted from mainframe files and databases and loaded into the Oracle 11g target database.
  • Developed Informatica ETL mappings using complex transformations and Mapplets to extract, transform, and load data into the operational database from legacy mainframe systems.
  • Used advanced features of PL/SQL such as BULK COLLECT, bulk bind, table partitioning, collections, pragma autonomous transactions, indexes, AWR and ADDM reports, hints, ANALYZE, records, tables, object types, and dynamic SQL to improve query performance (a sketch of the bulk-processing pattern follows this list).
  • Identified and fixed the Bottle Necks and tuned the Mappings and Sessions for improving performance. Tuned both ETL process as well as Databases.
  • Worked extensively on developing procedures, functions, packages, bulking techniques, pipelined functions, materialized views, table indexing, Index Organized Tables (IOT), SQL*Loader, and ETL tools.
  • Deployed mappings and mapplets from IDQ to power center for scheduling.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance, used in pre- and post-session management.
  • Implemented Slowly Changing Dimensions - Type I, II & III in different mappings as per the requirements.
  • Used Cognos 10.1 BI Framework Manager to build models (semantic layers, cardinality, relationships, query subjects) and packages, published packages to Cognos Connection, and implemented security for the packages.
  • Organized, prioritized, and tracked incidents and support requests through the ServiceNow tool per SLAs and corporate policy.
  • Provided feedback and recommendations for managing metadata from the various metadata sources and incorporated Velocity standards for Metadata Management.
  • Involved in creation of various reusable custom models for creating the metadata catalogs.
  • Created complex mappings using Designer to pull the metadata from CSV files of various application systems using the ETL tool Informatica.
  • Created prompts, Calculations, Filters, Developing Prompt pages and Conditional Variables using Report Studio.
  • Tested and validated the Report output against database to measure performance of reports.
  • Migrated mappings, workflows, and parameter files from development to production.
  • Configured trust and created queries and packages to integrate the trust score.
  • Involved in creating various match rules: fuzzy match rules, which match on tokens generated through tokenization, and exact match rules, which return exact record matches.
  • Worked on creating Unit testing documents.
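
A minimal sketch of the BULK COLLECT / FORALL batching pattern mentioned above, driven from a shell script through sqlplus. The connection variables and the stg_orders / dw_orders tables and columns are assumptions for illustration only:

    #!/bin/bash
    # Illustrative sketch: connection details and table/column names are assumed.
    sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<'EOF'
    SET SERVEROUTPUT ON
    DECLARE
      TYPE t_ids IS TABLE OF stg_orders.order_id%TYPE;   -- assumed staging table
      v_ids t_ids;
      CURSOR c_src IS SELECT order_id FROM stg_orders WHERE status = 'NEW';
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO v_ids LIMIT 10000;  -- fetch in batches instead of row by row
        EXIT WHEN v_ids.COUNT = 0;
        FORALL i IN 1 .. v_ids.COUNT                      -- one context switch per batch
          UPDATE dw_orders SET status = 'LOADED' WHERE order_id = v_ids(i);
        COMMIT;
      END LOOP;
      CLOSE c_src;
      DBMS_OUTPUT.PUT_LINE('Bulk update complete.');
    END;
    /
    EOF

The LIMIT clause keeps memory bounded on large staging tables while FORALL avoids per-row context switching, which is the performance motivation behind the bulk techniques listed above.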

Environment: Informatica PowerCenter 9.6.2, IBM Optim 9.1, SQL Server 2012, Hadoop, Oracle, MS Excel, Cognos, UNIX Shell Scripting, WinSCP.

Confidential, NJ

Informatica IDQ Developer

Responsibilities:

  • Worked on Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Workflow Designer.
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the porting and monitoring capabilities of IDQ 9.6.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
  • Developed various Mappings, Mapplets, Workflows and Transformations for flat files and XML.
  • Created and configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Experience using bulking techniques (FORALL, BULK COLLECT) to improve performance.
  • Developed ETL processes, database packages, scripts, and programs using PL/SQL and other programming and scripting languages.
  • Reviewed and designed the MDM/ActiveVOS setup and performed development and deployment.
  • Configured Landing, staging tables and Base Objects.
  • Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
  • Set up SIF for Java application communication and interfaced with IDD as required.
  • Deployed a new MDM Hub for portals in conjunction with the user interface on the IDD application.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Responsible for identifying bottlenecks and fixing them through performance tuning in the Netezza database.
  • Extracted the data from Siebel, as needed by the external systems, into flat files and staging tables.
  • Worked with the Data Steward Team on designing, documenting, and configuring Informatica Data Director to support management of MDM data.
  • Created workbooks and dashboards for analyzing statistical billing data using Tableau.
  • Created multiple visualizations like Heat Maps, Bar Graphs and Bullet Graphs.
  • Prepared dashboards using calculations and parameters in Tableau.
  • Created Netezza SQL scripts to verify that tables loaded correctly (a verification sketch follows this list).
  • Utilized the Informatica Data Quality management suite (IDQ and IDE) to complete initial data profiling and to identify and merge customers and addresses.
  • Cleansed, standardized, labeled, and fixed data gaps in IDQ, where checks against reference tables resolved major business issues.
  • Created various rules in IDQ to satisfy the Completeness, Conformity, Integrity and Timeliness.
  • Designed mappings using B2B Data Transformation Studio.
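
A minimal sketch of the kind of load-verification script referenced above: it compares the delimited source file's record count with the Netezza table row count through nzsql. The file, database, and table names, and the assumption of one record per line with no header, are illustrative:

    #!/bin/bash
    # Illustrative sketch: file, database, and table names are assumed.
    SRC_FILE=/data/inbound/customers.dat   # assumed delimited source file, one record per line
    DB=EDW                                 # assumed Netezza database
    TABLE=STG_CUSTOMER                     # assumed staging table

    file_cnt=$(wc -l < "$SRC_FILE")

    # -A -t makes nzsql return just the bare count, with no headers or padding
    tbl_cnt=$(nzsql -d "$DB" -A -t -c "SELECT COUNT(*) FROM $TABLE;")

    if [ "$file_cnt" -eq "$tbl_cnt" ]; then
        echo "Load check passed: $tbl_cnt rows in $TABLE"
    else
        echo "Load check FAILED: file=$file_cnt table=$tbl_cnt" >&2
        exit 1
    fi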

Environment: Informatica PowerCenter 9.6, Informatica IDQ 9.6, Informatica MDM 9.6, Informatica B2B, Oracle 11g, Netezza 4.2, ServiceNow, Tableau, UNIX, FTP, Toad.

Confidential, Los Angeles, CA

ETL/Informatica Developer

Responsibilities:

  • Designed ETL's to extract data from COBOL Files and update the EDW with the Patient related Clinical information. This includes a one-time history load and subsequent daily loads.
  • Worked on PowerExchange for importing source data to the Power center.
  • Created mappings to read COBOL source file and write it in ASCII file format. Target file is same as source structure.
  • Extracted data from mainframes using Power Exchange and COBOL files to create client-specific reports related to Medicaid claims.
  • Generated complex Transact-SQL (T-SQL) queries, subqueries, correlated subqueries, dynamic SQL queries, etc.
  • Created programming code using advanced concepts of records, collections, and dynamic SQL.
  • Performed performance tuning of stored procedures and SQL queries used in SSIS, with SQL Profiler and the Index Tuning Wizard.
  • Created sprint plans for developers' responsibilities and scheduled sprints in VersionOne.
  • Developed and deployed SSIS packages for ETL from OLTP and various sources to staging, and from staging to the data warehouse, using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Term, Slowly Changing Dimension, and more. Performed ETL mappings using MS SQL Server Integration Services.
  • Extracted and reviewed data from heterogeneous sources from OLTP to OLAP using MS SQL Server Integration Services (SSIS).
  • Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
  • Created mapping documents with business rules using MATT Tool.
  • Created list reports, cross-tab reports, chart reports, and reports with features like conditional formatting, page breaks, master-detail, drill-through, drill-up, and drill-down, and extensively used the other features in Cognos.
  • Created the physical and logical structure of the new data warehouse (SQL database) and the process workflow using MS Visio.
  • Created complex custom metadata models and templates using Informatica Metadata Manager.
  • Optimized and performance-tuned Metadata Manager resources to achieve faster response times.
  • Involved in creating Metadata Manager templates, resources, and catalogs.
  • Developed filters, calculations, prompts, and conditions, and created various reports for users using Cognos Report Studio.
  • Worked in Git Bash to check code into the different environments using Git commands (a sketch follows this list).
  • Used Spash to verify checked-in code in different environments.
  • Worked with creation of Users, Groups, Roles and grant privileges to the users. Create folders, Relational and Application connections and configure the ODBC connectivity.
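
A minimal sketch of the Git Bash check-in flow described above. The branch names, remote, file path, and change reference assume a typical environment-branch model, not the project's actual repository layout:

    #!/bin/bash
    # Illustrative sketch: branch names, remote, and file path are assumed.
    git checkout develop                                   # start from the development branch
    git pull origin develop                                # sync with the remote before committing
    git add mappings/wf_load_claims.xml                    # stage the exported workflow XML (assumed path)
    git commit -m "Update claims load workflow (CR-1234)"  # hypothetical change reference
    git push origin develop                                # check the change into the dev environment

    # Promote to the QA environment branch via a merge (assumed branching model)
    git checkout qa
    git pull origin qa
    git merge --no-ff develop -m "Promote claims load workflow to QA"
    git push origin qa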

Environment: Informatica PowerCenter 9.5, Power Exchange 9.5, Oracle, SQL*Plus, MS SQL Server 2012, MS BI Stack, SSMS, SSIS, UNIX, Data Marts, UNIX Shell Scripting, MATT, Spash, VersionOne and GIT.

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Developed various Mappings, Mapplets, and Transformations for the Data warehouse.
  • Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
  • Extracted data from Teradata into Informatica PowerCenter and developed code using different transformations, loading to the Teradata landing area.
  • Worked on loading data from several flat file sources into Teradata using Teradata MultiLoad and FastLoad (a FastLoad sketch follows this list).
  • Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
  • Used Teradata Manager, Index Wizard, and PMON utilities to improve performance.
  • Extensively worked on Autosys to schedule the jobs for loading data.
  • Used MultiLoad and BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within databases, etc.
  • Worked on Power Exchange for change data capture (CDC).
  • Worked on the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, and BTEQ utilities and Teradata SQL.
  • Created and modified MultiLoad scripts for Informatica using UNIX and loaded data into the IDW.
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Responsible for the Data Cleansing of Source Data using LTRIM and RTRIM operations of the Expression Transformation.
  • Designed schemas using Fact, Dimension, Physical, Logical, Alias, and Extension tables in the OBIEE Administration Tool.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
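
A minimal sketch of a FastLoad script of the kind referenced above for loading a pipe-delimited flat file into an empty Teradata staging table, driven from a shell script. The TDPID, credentials, file path, and table names are illustrative assumptions:

    #!/bin/bash
    # Illustrative sketch: logon details, file path, and table names are assumed.
    fastload <<'EOF'
    SESSIONS 4;
    ERRLIMIT 25;
    LOGON tdprod/etl_user,etl_pwd;            /* assumed TDPID and credentials */
    SET RECORD VARTEXT "|";                   /* pipe-delimited source file */

    DEFINE acct_id (VARCHAR(18)),
           txn_amt (VARCHAR(20)),
           txn_dt  (VARCHAR(10))
    FILE = /data/inbound/transactions.dat;

    BEGIN LOADING stg.transactions
          ERRORFILES stg.transactions_err1, stg.transactions_err2;

    INSERT INTO stg.transactions (acct_id, txn_amt, txn_dt)
    VALUES (:acct_id, :txn_amt, :txn_dt);

    END LOADING;
    LOGOFF;
    EOF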

Environment: Informatica PowerCenter 9.1, Power Exchange 9.1, Teradata 13.10, Oracle 11g, Data Marts, Erwin Data Modeler 4.1, OBIEE 11.1.1.0, UNIX Shell Scripting, Data Modeling, Autosys.

Confidential

Jr. Data Warehouse Developer

Responsibilities:

  • Experience working with the complete Software Development Life Cycle of the application.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Tuned performance of Informatica PowerCenter sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Worked with SAP and Oracle sources to process the data.
  • Worked on SAP data migration for Human Resources and Finance, converting various objects for Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll, etc., to generate reports from the SAP BI system.
  • Worked with pre- and post-session SQL commands to drop and recreate the indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
  • Created UNIX shell scripts to automate sessions and cleanse the source data (a cleansing sketch follows this list).
  • Implemented pipeline partitioning concepts like Round-Robin, Key-Range and Pass Through techniques in mapping transformations.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Worked on data masking activities for cloning the GDW for several buyers.
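
A minimal sketch of the kind of source-cleansing shell script referenced above; the file paths and the pipe delimiter are assumptions. It strips Windows carriage returns, trims whitespace around each field, and drops empty lines before the session reads the file:

    #!/bin/bash
    # Illustrative sketch: paths and delimiter are assumed.
    SRC=/data/inbound/employees_raw.dat    # assumed raw source file
    CLEAN=/data/work/employees_clean.dat   # assumed cleansed output read by the session

    tr -d '\r' < "$SRC" \
      | sed 's/[[:space:]]*|[[:space:]]*/|/g; s/^[[:space:]]*//; s/[[:space:]]*$//' \
      | grep -v '^$' > "$CLEAN"

    echo "Cleansed $(wc -l < "$CLEAN") records into $CLEAN"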

Environment: Informatica PowerCenter 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, Sun Solaris.
