
Sr. Informatica Consultant (ETL Developer) Resume

Scottsdale, Arizona

PROFESSIONAL SUMMARY:

  • 10+ years of total IT experience, including 7+ years in Analysis, Design, Development, Testing, Implementation, and Production Support for various industries using Data Warehouse/Data Mart Design, Data Analysis, ETL, OLAP, Client/Server, and Web applications on Windows and UNIX platforms.
  • Informatica: 7+ years of Data Warehousing experience using Informatica Power Center 9.0/8.6.1/8.1.2/8.0/7.1.1/7.1.2/7.0/6.1/5.1.1, ETL, OLAP, OLTP, and Tidal.
  • Expert in writing BTEQ, FastLoad, and MultiLoad scripts per the business demand with the given transformation or business logic, with good exposure to FastExport, TPump, and TPT (Teradata Parallel Transporter); a representative sketch follows this summary.
  • Worked extensively with joins, sub-queries, and set operations.
  • Applied various Teradata features such as UPI, USI, PPI, and Join Indexes.
  • Worked with the EXPLAIN command and COLLECT STATISTICS, and identified join strategies.
  • Identified bottlenecks in long-running queries (spool space issues, skewness, etc.) and implemented appropriate tuning methods.
  • Good understanding of Teradata architecture and protection concepts such as locks, fallback, journals, RAID, and cliques.
  • Wrote unit test cases and submitted unit test results as per the quality process.
  • Gained exposure to the Software Development Life Cycle (SDLC).
  • Worked on various Teradata versions including TD12, TD13, TD14, and TD15.
  • Experience in debugging and performance tuning of sources, targets, mappings and sessions.
  • Used Informatica Workflow Manager and Workflow Monitor to create and monitor workflows, worklets, and sessions.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files, VSAM files and Excel files to staging database and from staging to the target Data Warehouse database.
  • Involved in the Analysis of Physical Data Model for ETL mapping and the process flow diagrams.
  • Data Analysis: 5+ years of strong Business Analysis experience on Data Analysis, User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
  • Data Modeling: Dimensional Data Modeling experience using Erwin 4.5/4.0/3.5.5/3.5.2, Dimensional Modeling, Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP, FACT & Dimension tables, Physical & Logical data modeling, and Oracle Designer.
  • Data Profiling and Cleansing: Experience in Data Cleansing using SQL coding.
  • Experience in Developing complex Mappings, Reusable Transformations, Sessions and Workflows using Informatica ETL tool to extract data from various sources and load into target tables.
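As a representative illustration of the BTEQ work noted above, a small load-and-tune script might take the following shape; the logon, database, table, and column names are hypothetical placeholders, not actual project objects, and TD_PASSWORD is assumed to be an environment variable.

#!/bin/ksh
# Hypothetical BTEQ script: load new staging rows into a dimension table and
# refresh optimizer statistics. All names and credentials are placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}
.SET ERROROUT STDOUT

-- Insert only the staging rows not already present in the target
INSERT INTO edw_db.customer_dim (customer_id, customer_name, load_dt)
SELECT s.customer_id, s.customer_name, CURRENT_DATE
FROM   stg_db.customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   edw_db.customer_dim d
                   WHERE  d.customer_id = s.customer_id);

-- Refresh statistics on the join column so the optimizer picks a good plan
COLLECT STATISTICS ON edw_db.customer_dim COLUMN (customer_id);

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF
exit $?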

TECHNICAL SKILLS:

Tools: Informatica PowerCenter 8.x/9.x/10.1, Informatica MDM 9.7/10.1/10.2, Informatica IDQ 10.2, Informatica IDD 10.1, SSIS, Data Analysis, DataStage, Toad, Putty, WinSCP, PL/SQL Developer, Oracle SQL Developer, Mainframe, Siebel, SAP, JIRA, WINSQL, TWS Scheduler, Apex Exchange, XML expertise, JSON, API analysis and data integration, web services

OFFICE S/W: MS Office (MS Word, MS Excel, MS PowerPoint)

Data Base: Oracle 10g/11g, DB2, SQL Server, Teradata

Methodologies: Data Modeling - Logical, Physical

Programming: UNIX Shell Scripting, SQL, PL/SQL

Operating Systems: UNIX, Windows XP/7/10

PROFESSIONAL EXPERIENCE:

Confidential, Scottsdale, Arizona

Sr. Informatica Consultant (ETL Developer)

Responsibilities:

  • Developed, configured, coded, tested and debugged new software solutions.
  • Developed, tested and implemented enterprise data movement (ETL and CDC) solutions.
  • Scheduled the sessions to extract, transform, and load data sourced from a REST API into the warehouse database per business requirements.
  • Worked extensively with the HTTP Transformation to extract data from the API.
  • Parsed JSON- and XML-formatted responses from the source REST API; an illustrative extraction sketch follows this list.
  • Addressed system defects and implemented enhancements to existing functionality.
  • Worked with onshore/offshore teams to analyze, develop, and improve ETL run times as well as to produce accurate, defect-free code.
  • Maintained productive working relationships with project sponsors and key systems users.
  • Analyzed the business and functional requirements and provided high level technical design specifications to drive ETL Development efforts.
  • Troubleshot issues with minimal guidance, identified bottlenecks in existing data workflows, and provided solutions for scalable, defect-free applications.
  • Participated in the definition of application scope and objectives through research and fact finding.
  • Provided Tier 3 support and resolution of open IT issues escalated by IT Customer Support.
  • Supported production environment in the event of emergency.
  • Provided performance tuning insight to project team and created reusable objects and templates.
  • Wrote Unix Scripts to invoke Informatica Workflows and sessions.
  • Implemented Error handling logic to capture Invalid/Null records coming from staging tables.
  • Created test plans for unit testing, integration testing, and UAT.
  • Managed QA and PROD deployments and automation of ETL Jobs.
  • Provided assistance in diagnosing production problems related to the project.
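The API extraction above was done inside Informatica via the HTTP Transformation; purely as a hedged illustration of the shape of such a pull outside the tool, a hypothetical UNIX sketch using curl and jq could look like the following, where the endpoint URL, API_TOKEN variable, and field names are all assumptions.

#!/bin/ksh
# Hypothetical illustration only: pull a JSON feed from a REST endpoint and
# flatten it to a pipe-delimited staging file an ETL session could read.
# API_URL, API_TOKEN and the field names are placeholders.
API_URL="https://api.example.com/v1/orders"
OUT_FILE=/data/staging/orders_$(date +%Y%m%d).dat

curl -s -H "Authorization: Bearer ${API_TOKEN}" "${API_URL}" |
  jq -r '.orders[] | [.order_id, .customer_id, .order_date, .amount] | map(tostring) | join("|")' > "${OUT_FILE}"

# Fail the job if nothing was extracted
[ -s "${OUT_FILE}" ] || { echo "No records extracted from ${API_URL}" >&2; exit 1; }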

Confidential, Blue Ash, Ohio

Sr. Informatica Consultant (ETL Developer)

Responsibilities:

  • Built partnerships across the application, business, and infrastructure teams.
  • Interacted with business users/customers to confirm the requirements for developing and modifying jobs, identified the various sources of data in operational systems, and developed strategies to build the data warehouse.
  • Analyzed the business and functional requirements and provided high level technical design specifications to drive ETL Development efforts.
  • Actively involved in the Design and development of the STAR schema data model.
  • Designed ETL specification documents to load the data in target using various transformations according to the business requirements.
  • Extensively worked with Informatica Power Center to load data from flat files, Oracle, and other source systems into the target database.
  • Designed and developed Complex Informatica mappings using various transformations like Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, and Rank for populating target tables/Files in an efficient manner.
  • Designed and developed ETL workflows with job dependencies and scheduling, participated in code reviews of ETL processes.
  • Created Visio diagrams of ETL process to include in the design documents.
  • Created check-lists for coding, reviewing, bug logging, troubleshooting, testing and release for smooth functioning of the entire project.
  • Created and Configured Workflows, Worklets, and Sessions to load the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Wrote UNIX scripts to invoke Informatica workflows and sessions; a sample wrapper sketch follows this list.
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Implemented Error handling logic to capture Invalid/Null records coming from staging tables.
  • Tuned and optimized ETL jobs for performance and throughput.
  • Created test plans for unit testing, integration testing, and UAT.
  • Developed SQL Scripts and ad-hoc queries for data verification and validation processes.
  • Managed QA and PROD deployments and automation of ETL Jobs.
  • Provided assistance in diagnosing production problems related to the project.
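A minimal sketch of the kind of UNIX wrapper used to invoke an Informatica workflow via pmcmd, as referenced above; the domain, service, folder, workflow names, and the INFA_PASSWORD variable are placeholders, not actual project values.

#!/bin/ksh
# Hypothetical wrapper: start an Informatica workflow with pmcmd, wait for it
# to finish, and propagate a non-zero return code to the scheduler.
# All names and credential variables are placeholders.
INFA_DOMAIN=Domain_ETL
INFA_SERVICE=IS_ETL
INFA_USER=etl_user
FOLDER=SALES_DW
WORKFLOW=wf_load_sales_fact

pmcmd startworkflow -sv "${INFA_SERVICE}" -d "${INFA_DOMAIN}" \
      -u "${INFA_USER}" -p "${INFA_PASSWORD}" \
      -f "${FOLDER}" -wait "${WORKFLOW}"
RC=$?

if [ ${RC} -ne 0 ]; then
    echo "Workflow ${WORKFLOW} failed with return code ${RC}" >&2
    exit ${RC}
fi
echo "Workflow ${WORKFLOW} completed successfully."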

Confidential, Lake Forest, IL

Sr. Informatica Consultant (ETL Developer)

Responsibilities:

  • Led the Data Integration team to drive detailed ETL and data requirements to ensure accurate capturing of client's business requirements and deliverables.
  • Involved in strategy development, project planning, and resource planning, allocation and budget management.
  • Acted as the Subject Matter Expert for ETL and Sales Comp.
  • Collaborated with IT and Business Teams to gather high-level Integration/Compensation requirements.
  • Designed and built ETL solutions to automate the data feeds from client systems.
  • Served as the go-to person for all data integration project-related questions in Professional Services.
  • Took the lead on building a custom process-queuing system for the data integration processes, giving customers visibility into the current status of their processes.
  • Worked with Cross Functional Teams on various initiatives.
  • Involved in all phases of our company’s evolution from a Start Up to an Enterprise.
  • Extensively worked on standardizing our methodologies to meet the industry standards with focus on quality, scalability and reusability.
  • Always lived up to our Core values of CARE (Customer Focus, Accountability, Respect and Excellence).

Environment: Informatica Power Center 9.0, Oracle 10g, Flat files, Git migration, STASH, UNIX, Shell Scripting, Xactly Connect, JIRA

Confidential, Chicago, IL

ETL Informatica Developer

Responsibilities:

  • Participated in daily/weekly meetings, monitored the work progresses of teams and proposed ETL strategies.
  • Involved in the complete life cycle of developing an enterprise data warehouse application and in developing the ETL architecture using Informatica.
  • Parsed high-level design specifications for ETL coding. Developed new complex SCD Type 1/Type 2 mappings, reworked the old mappings across different layers, and proposed strategies for future growth of the data.
  • Fixed existing mappings for bugs/defects and developed new mappings based on new requirements. Identified bottlenecks in existing processes and optimized them to accelerate processing.
  • Migrated code from the Dev to Test to Prod environments and wrote team-based development technical documents for a smooth project transfer. Prepared ETL technical mapping documents along with test cases for each mapping to support future development and maintain the SDLC.
  • Created pre- and post-session UNIX scripts, functions, triggers, and stored procedures to drop and re-create indexes and to handle complex calculations on the data (an illustrative sketch follows this list). Responsible for transforming and loading large sets of structured, semi-structured, and unstructured data from heterogeneous sources.
  • Extensively worked on CDC to capture data changes in the sources and support delta loads. Used the Debugger to validate the mappings and gather troubleshooting information about the data and error conditions.
  • For each Mapping prepared effective Unit, Regression, Integration and System test cases for various stages to capture the data discrepancies/inaccuracies to ensure the successful execution of accurate data loading.
  • Peer-reviewed code against the tech specs to verify that the logic met business requirements and client standards, and fixed any discrepancies. Identified feasible alternative approaches, systems, and equipment that reduce cost and improve efficiency while meeting expectations.
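A minimal sketch of the pre-/post-session scripting described above, assuming an Oracle target reached through sqlplus; the connection variables, schema, table, and index names are hypothetical placeholders.

#!/bin/ksh
# Hypothetical pre/post-session script: drop an index before a bulk load
# ("pre") and rebuild it afterwards ("post"). All names are placeholders.
MODE="$1"

case "${MODE}" in
  pre)
    sqlplus -s "${ORA_USER}/${ORA_PASSWORD}@${ORA_SID}" <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
DROP INDEX dw.idx_sales_fact_cust;
EXIT;
EOF
    ;;
  post)
    sqlplus -s "${ORA_USER}/${ORA_PASSWORD}@${ORA_SID}" <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
CREATE INDEX dw.idx_sales_fact_cust ON dw.sales_fact (customer_id);
EXIT;
EOF
    ;;
  *)
    echo "Usage: $0 pre|post" >&2
    exit 1
    ;;
esac
exit $?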

Environment: Informatica PowerCenter 9.6.1 HF2, SQL Server 2012, DB2, SQL, PL/SQL, TOAD 9.5, WinSCP, UNIX Shell Scripting, SQL Developer

Confidential

Sr. Data Analyst (ETL)

Responsibilities:

  • Interacted with source system SMEs to analyze how various business processes have been tracked across the source tables.
  • Defined project requirements by identifying project milestones, phases, and elements; forming the project team; and establishing the project budget.
  • Performed data analysis, system conversions and integrations.
  • Developed sessions using different partitioning techniques, such as database and pass-through partitioning, for better performance.
  • Implemented an audit process to ensure the data warehouse matches the source systems from all reporting perspectives; an illustrative audit sketch follows this list.
  • Used the SQL Assistant front-end tool to issue SQL commands matching the business requirements and to run reports on provider data.
  • Provided data analysis support to the fraud investigation team for their investigation leads.
  • Troubleshot client and operational issues in a timely and appropriate manner.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Served as a process expert for a defined functional area, which includes:
  • Managing key data sources & inputs
  • Planning and ensuring implementation of end state automation activities
  • Providing guidance on the design, development, and implementation of automated processes
  • Involved in unit testing and user acceptance testing to verify that the data loaded into the target, extracted from different source systems, was accurate and met user requirements.
  • Prepared the Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for the Maintenance and Operation of the application.
  • Provided Nightly batch loads support and implemented solutions to correct the data issues raised by end user during production support phase of the project.
  • Coordinated with the client and gathered the user requirements and developed the reports in text files by translating the business validations.
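A minimal sketch of the audit/reconciliation check described above, assuming an Oracle connection via sqlplus; the connection variables, table names, and load-date column are hypothetical placeholders.

#!/bin/ksh
# Hypothetical audit check: compare source and warehouse row counts for one
# load date and fail if they do not match. All names are placeholders.
LOAD_DT=${1:-$(date +%Y-%m-%d)}

COUNTS=$(sqlplus -s "${ORA_USER}/${ORA_PASSWORD}@${ORA_SID}" <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT (SELECT COUNT(*) FROM src.claims     WHERE load_dt = DATE '${LOAD_DT}') || '|' ||
       (SELECT COUNT(*) FROM dw.claims_fact WHERE load_dt = DATE '${LOAD_DT}')
FROM dual;
EXIT;
EOF
)

SRC_CNT=$(echo "${COUNTS}" | cut -d'|' -f1 | tr -d ' ')
TGT_CNT=$(echo "${COUNTS}" | cut -d'|' -f2 | tr -d ' ')

if [ "${SRC_CNT}" != "${TGT_CNT}" ]; then
    echo "Audit mismatch for ${LOAD_DT}: source=${SRC_CNT} target=${TGT_CNT}" >&2
    exit 1
fi
echo "Audit passed for ${LOAD_DT}: ${SRC_CNT} rows."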

Environment: SQL Server 2008 R2, DB2, Flat files, SharePoint, Windows 7 Professional, UNIX, Oracle 10g, Windows 2000.

Confidential

QC Mechanical Engineer

Responsibilities:

  • Conducted non-destructive tests on welded joints as per the field welding schedule; submitted QA documents to the potential clients.
  • Calibrated the on-line automated ultrasonic testing (AUT) system (SOFRATECH FRANCE).
  • Performed regular visual inspection of welded joints.
  • Performed magnetic particle testing on the bevel of the pipe as per GAIL specification.
  • Carried out visual and dimensional inspections.
  • Selected DAC blocks and probes for UT.
  • Witnessed hydro tests and carried them out as per client specification.
  • Submitted daily progress reports to the TPI (EIL).
  • Witnessed UT machine calibration for the TPI.
  • Gave demonstrations of AUT to the TPI as per GAIL specification.
  • Submitted daily progress reports after witnessing by the TPI.
  • Witnessed calibration of UT machines for scanning.
  • Performed visual inspection of pipe mechanical properties.
