- Software professional with 10 years of experience as an ETL Developer, covering all life cycle phases: Requirement Analysis, Functional Analysis, Design, Development, Implementation, Testing, Debugging, Production Support, and Maintenance of various Data Warehousing applications.
- 8 years of experience in ETL data warehousing using Informatica PowerCenter 9.6.1/9.0.1/8.x, Informatica PowerExchange 9.6.1/9.0.1, MDM 9.6 and Informatica Data Quality 9.6.1, which involved planning, designing, developing and implementing data warehouses/data marts, with experience in both relational and multidimensional models.
- 5+ years of experience in Data Quality: profiling, validations, reference checks and exception handling using Informatica Data Quality.
- 3+ years of experience in Change Data Capture (CDC) methodology using PowerExchange 9.0/8.x.
- 3+ years of experience in Teradata 14/13 SQL and utilities like MLOAD, Fast Load, TPT.
- 4+ years of experience in ORACLE PL/SQL; extensively wrote code for packages, functions and procedures.
- Expert in writing optimized SQL queries using ORACLE, SQL Server, Teradata, MySQL and Hive.
- Exposure to writing SQL using analytical functions such as Ranking functions, Reporting Aggregate functions, LAG/LEAD functions, FIRST/LAST functions etc.
- Experience with Informatica Cloud integration for Amazon Redshift and AWS S3 buckets.
- Excellent experience in performance analysis and SQL query tuning while designing scalable applications from scratch and maintaining existing ones, using Query Optimizer, indexes, execution plans, hints, Explain Plan and TKPROF.
- Experience in creating transformations and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets, data marts and the data warehouse.
- Expert in performance tuning of Informatica mappings, identifying source and target bottlenecks.
- Experience in Bill Inmon and Ralph Kimball data warehouse design and implementation methodologies.
- Expertise in OLTP/OLAP System Study, E-R modeling, developing Database Schemas (Star schema and Snowflake schema) used in relational and dimensional modeling.
- Extensive experience in integration of various data sources such as ORACLE 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, Flat Files, MQ Series and XML files.
- Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
- Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.
- Expertise in Teradata utilities like Multi Load, Fast Load, Fast Export, BTEQ, TPUMP, TPT and tools like SQL Assistant, Viewpoint.
- Well-versed in tuning Teradata ETL queries, remediating stale or excessive statistics, resolving spool space issues, applying compression for space reclamation, etc.
- Proficient in Data Analysis, Data Validation, Data Lineage, Data Cleansing, Data Verification and identifying data mismatches.
- Extensive experience in data analysis, ETL techniques and MD5 logic for CDC loads.
- Experienced in writing UNIX shell scripts, SQL Loader, Procedures/Functions, Triggers and Packages.
- Skilled in creating & maintaining ETL Specification Documents, Use Cases, Source to target mapping, Requirement Traceability Matrix, performing Impact Assessment and providing Effort estimates, deployment artifacts.
- Worked for many years in an onsite-offshore model; provided technical design leadership to ensure the efficient use of offshore resources and the selection of appropriate design and ETL/CDC logic.
- Extensive experience of providing IT services in Healthcare and Banking industries.
- Exposure to end-to-end SDLC and Agile methodology.
- Strong Knowledge of Hadoop Ecosystem (HDFS, HBase, MapReduce, Hive, Pig, NoSQL etc.)
- Strong experience in handling multiple projects at the same time in a multi-vendor environment with full customer satisfaction.
- Excellent documentation skills, team problem-solving ability, and analytical and programming skills in high-speed, quality-conscious, multitasking environments.
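The MD5-based CDC load logic noted above can be sketched as follows. This is a minimal illustration, not code from any engagement; the column names, key and delimiter are hypothetical:

```python
import hashlib

def row_hash(row, tracked_cols, delim="|"):
    # MD5 over the concatenated tracked columns; a change in any column changes the hash
    payload = delim.join("" if row.get(c) is None else str(row[c]) for c in tracked_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def classify_rows(source_rows, target_hashes, key_col, tracked_cols):
    # Compare incoming rows against stored target hashes keyed by the natural key,
    # splitting them into inserts, updates and unchanged rows
    inserts, updates, unchanged = [], [], []
    for row in source_rows:
        h = row_hash(row, tracked_cols)
        existing = target_hashes.get(row[key_col])
        if existing is None:
            inserts.append(row)
        elif existing != h:
            updates.append(row)
        else:
            unchanged.append(row)
    return inserts, updates, unchanged
```

In an ETL tool the same idea is usually expressed as an MD5 expression on the source side compared against a stored hash column in the target, avoiding column-by-column comparisons.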
Frontend Environment: Cognos, Business Objects, SAS, MicroStrategy, OBIEE
ETL: Informatica PowerCenter, Informatica IDQ, SSIS
Programming Languages: PL/SQL, JAVA, Perl, VBA, VB, UNIX, SSIS, Python
Databases: MS Access, Oracle 11g, Teradata, DB2, Sybase
Data Tools: Erwin, SAS Data Flux, Informatica IDQ
Confidential, Warren, NJ
Sr. Informatica developer
- Coordinated with various business users, stakeholders and SMEs to obtain functional expertise, review designs and business test scenarios, participate in UAT and validate data from multiple sources.
- Contributed to the creation of the BRD and functional/technical design documents after multiple sessions with the business users' team.
- Defined and developed brand-new standard design patterns, ETL frameworks, data model standards guidelines and ETL best practices.
- Provided technical design leadership to this project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.
- Performed detailed data investigation and analysis of known data quality issues in related databases through SQL
- Actively involved in the analysis phase of the business requirements and the design of the Informatica mappings.
- Performed data validation, data profiling, data auditing and data cleansing activities to ensure high-quality report deliveries.
- Developed ETL routines using Informatica PowerCenter and created mappings involving transformations such as Lookup, Aggregator, Rank, Expression and Mapplets, connected and unconnected stored procedures, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Extensively used ETL processes to load data from various source systems such as ORACLE, DB2, SQL Server, Teradata, flat files and XML files into the target SQL Server system, applying business logic in transformation mappings to insert and update records during the load.
- Involved in developing scripts, routines for database consistency checks, and bulk validation for legacy systems.
- Performance-tuned multiple Informatica mappings to keep pace as the volume of internet orders entering the system grew daily; the techniques involved were PDO (pushdown optimization), bulk loading, and adding indexes on the staging tables.
- Used the PowerExchange Change Data Capture (CDC) option to capture inserts, updates and deletes as soon as they occur and transfer them to multiple targets without intermediate queues or staging tables.
- Used Teradata utilities such as BTEQ, FASTLOAD, MLOAD and FASTEXPORT.
- Sourced data from RDS and AWS S3 buckets and loaded it into the Teradata target.
- Mounted S3 bucket in local UNIX environment for data analysis.
- Worked with Power Center team to load data from external source systems to MDM hub 9.6.
- Performed MDM hub configuration: data modeling, data mappings, data validation, match and merge rules, Hierarchy Manager, and customization/configuration.
- Extensively used Informatica Workflow Manager and Workflow Monitor to develop, manage and monitor workflows and sessions.
- Proposed and developed multiple cleansing functions in MDM hub and IDQ to get the required cleansed/standardized output.
- Used database partitioning and Informatica partitioning for better performance when working with large data volumes.
- Used XSL to transform XML files when implementing the front-end template system.
- Hands-on experience with Informatica PowerCenter and implementing data quality rules with IDQ.
- Used CA7 scheduling tool for scheduling the Informatica workflows.
- Analyzed session log files to resolve errors in mapping or session configuration whenever a session failed.
- Wrote UNIX shell scripts for file manipulation, FTP, and scheduling workflows.
- Coordinated with the offshore team on a daily basis to enable faster development.
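Conceptually, the CDC flow described above applies an ordered stream of insert/update/delete events directly to the targets. A toy-scale sketch (event shape and keys are illustrative, not PowerExchange's actual interface):

```python
def apply_change_events(target, events):
    # Apply an ordered CDC event stream to a keyed target.
    # Each event is (op, key, row); row is None for deletes.
    for op, key, row in events:
        if op == "insert":
            target[key] = row
        elif op == "update":
            # merge changed columns over the existing row
            target[key] = {**target.get(key, {}), **row}
        elif op == "delete":
            target.pop(key, None)
        else:
            raise ValueError(f"unknown CDC operation: {op}")
    return target
```

Preserving event order per key is what lets the changes land in the target without intermediate queues or staging tables.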
Environment: Informatica PowerCenter 9.6.1/9.5.1 (Repository Server, Repository Manager, Designer, Workflow Manager and Workflow Monitor), MDM HUB 9.6, IDQ 9.1, Teradata 14.1.0, Oracle 11g, AWS RDS, S3 Bucket, Toad, PL/SQL, SQL Server 2008, QlikView, SQL, UNIX Shell Scripts, XML, XSL.
Confidential, Atlanta, GA
Sr. Informatica Developer
- Worked with business analysts on requirement gathering, business analysis, testing and project coordination, using interviews, document analysis, business process descriptions, scenarios and workflow analysis.
- Created the Technical Design Document / Minor Release Document (MRD) from the Business Requirements Document (BRD) or Functional Requirement Document (FRD) supplied by the business analyst, based on business objectives, and facilitated joint sessions.
- Analyzed business and system requirements to identify system impacts.
- Created flow diagrams and charts.
- Validated, Standardized and cleaned data as per the business rules using IDQ.
- Initiated the process of Data Profiling by Profiling different formats of data from different sources and users.
- Created detailed technical design documents containing the ETL technical specifications for the given functionality: overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, database changes, shell scripts etc.; conducted meetings with the business analysts and clients for approval of the process.
- Analyzed the existing mapping logic to determine the reusability of the code.
- Handled versioning and dependencies in Informatica.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Translated the PL/SQL logic into Informatica mappings including Database packages, stored procedures and views.
- Administered Informatica PowerCenter and the change data capture software used for data integration.
- Performed admin jobs and tasks, reviewed logs and deployed ETL code.
- Involved in extensive performance tuning by determining bottlenecks at various points such as targets, sources, mappings, sessions or the system, leading to better session performance.
- Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
- Set up Users and User accounts as per requirements.
- Worked extensively with version control software (GitHub).
- Actively participated in building Data Quality services and frameworks.
- Created UNIX scripts to read/write and FTP files between Windows servers and UNIX.
- Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
- Performed ETL and database code migrations across environments using deployment groups.
- Implemented business rules in mappings to populate the target tables.
- Developed Informatica Data Quality Mappings, sessions, workflows, scripts and orchestration Schedules.
- Involved in end-to-end system testing, performance and regression testing and data validations.
- Worked extensively on modifying and updating existing Oracle code, including object types, views, PL/SQL stored procedures, packages, functions and triggers, based on business requirements.
- Worked in Agile minor release cycles as the designated database developer.
- Unit tested database changes and supported QA and UAT testing.
- Managed performance and tuning of SQL queries and fixed slow-running queries in production.
- Helped support data masking projects for DRD across all Dev, QA and UAT environments via Enterprise Data Obfuscation.
- Created, verified and deployed builds to QA and UAT using Transporter.
- Created batch scripts for automated database build deployment.
- Worked extensively on Teradata.
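The parameter files maintained for the workflows above follow a simple INI-like layout of sections and name/value pairs. A minimal reader can be sketched as follows (the folder, workflow and variable names are illustrative, and this simplification ignores several details of the real format):

```python
def parse_param_file(text):
    # Parse an Informatica-style parameter file into {section: {name: value}}.
    # Sections look like [Folder.WF:wf_name]; entries like $$LOAD_DATE=2024-01-01.
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue  # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

A reader like this is handy for validating per-environment parameter files before workflows pick them up.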
Confidential, Washington DC
- Worked with the Tech Manager and Business Analysts of the team to understand the prioritization of assigned trouble tickets and the business user needs/requirements driving the support requests.
- Performed problem assessment, documentation, performance review and quality validation on new and existing BI environments.
- Created the ETL technical specification for the effort based on business requirements, the source system being mainly Oracle Master Data.
- Worked closely with business users, Business Analysts and Shared Services Technical Leads in defining technical specifications and designs for an Oracle-based large data warehouse environment.
- Developed detailed ETL specifications based on the technical specification for the BI effort, within the ETL design standards and guidelines.
- Served as S&P’s ETL expert, performing technical development, maintenance, tuning and support activities.
- Ensured testing data was available for unit, system and acceptance testing within the development and QA environments.
- Unit tested ETL code/scripts to ensure correctness of functionality and compliance with business requirements.
- Refined ETL code/scripts as needed to enable migration of new code from development to QA and QA to production environments following the standard migration signoff procedure.
- Performed system testing, integration testing and helped user for UAT based on business requirements.
- Coordinated among multiple teams (ETL Admin, UNIX Admin) and the management team for ETL deployment approval.
- Reviewed ETL development and worked closely to drive quality of implementation; ensured unit testing was completed and quality audits were performed on the ETL work.
- Designed the ETL specification documents to gather workflows information from offshore and shared with Integration and production maintenance team.
Environment: Oracle 11g/10g/9i/8i, SQL, PL/SQL, UNIX, Shell scripting, XML, XSL, Informatica Power Center 8.6, .NET, Crystal Reports, Oracle Hyperion Essbase and Planning, MS SQL Server, COBOL, JCL.
Confidential, Calabasas, CA
- Involved in database design and created data marts extensively using Star Schema.
- Worked extensively on SQL and UNIX shell scripting.
- Involved in developing packages for implementing business logic through procedures and functions.
- Extensively involved in application tuning, SQL tuning using Explain Plan.
- Created high-level and low-level functional and technical specification documents for application development.
- Involved in the full project life cycle, from analysis to production implementation, with emphasis on identifying the source and validating source data, developing the required logic and transformations, and loading the data into different target tables.
- Performed dimensional data modeling using Star Join Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling.
- Used Teradata Utilities BTEQ, Fast Load, Multi Load utilities for loading bulk data.
- Enhanced query performance by applying optimization techniques such as index creation, table partitioning and stored procedures.
- Used Teradata SQL Assistant to run SQL queries and validate the data in the warehouse.
- Performed load and integration tests on all programs created and applied version control using Harvest tool to ensure that programs are properly implemented in production.
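The star-schema fact loads described above hinge on resolving each fact row's natural key to a dimension surrogate key, typically via a Lookup transformation. A minimal sketch of that step (column names, keys and the -1 "unknown member" convention are illustrative):

```python
def resolve_surrogate_keys(fact_rows, dim_lookup, nk_col, sk_col, unknown_sk=-1):
    # Replace the natural key on each fact row with the dimension surrogate key;
    # rows whose key is missing from the dimension get the 'unknown member' key.
    out = []
    for row in fact_rows:
        new_row = {k: v for k, v in row.items() if k != nk_col}
        new_row[sk_col] = dim_lookup.get(row[nk_col], unknown_sk)
        out.append(new_row)
    return out
```

Routing unmatched keys to a default member instead of dropping the row keeps fact totals reconcilable with the source.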
Environment: Oracle 11g/10g/9i/8i, SQL, PL/SQL, UNIX, Shell scripting, Informatica Power Center 8.6, Teradata TD 12, UNIX Shell Scripts, Harvest and Control-M.
Confidential, San Francisco
- Worked in collaboration with onsite/offshore teams.
- Developed database objects including indexes, views, sequences, packages, compound triggers, functions and procedures to troubleshoot database problems.
- Analyzed, designed, developed, tested, troubleshot, optimized, enhanced and maintained data integration and EDI applications in various formats using a combination of SQL and data mapping tools.
- Worked on advanced PL/SQL concepts: object types, collections and dynamic SQL.
- Made wide use of PL/SQL tables and BULK COLLECT to process table loads and retrieve data into Oracle efficiently.
- Created users and granted them appropriate privileges and roles.
- Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
- Transferred data using Export/Import Utilities.
- Prepared a backup schedule including offline, online and logical backups.
- Implemented Oracle packages using pipelined table functions.
- Implemented virtual private database for data security.
- Implemented referential integrity and business rules as per design document.
- Worked on analytical functions and autonomous transactions.
- Developed UNIX shell scripts with embedded SQL*Loader, expdp/impdp calls and PL/SQL statements to extract data into flat files for data loading and migration.
- Monitored the performance of the database generating AWR reports.
- Implemented UNIX shell scripts for data migration and batch processing.
- Worked extensively with the UTL_FILE utility for file operations.
- Worked on Oracle 10g/11g/Exadata, including designing tables, stored procedures, functions and triggers, and performance tuning.
- Participated in code reviews, maintained checklist and document preparation.
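The SQL*Loader-driven loads above are typically scripted around a generated control file. A minimal generator for a delimited flat file is sketched below; the table, columns and file path are hypothetical, and a real load would also set options such as error limits and per-column datatypes:

```python
def make_sqlldr_control(table, columns, datafile, delimiter=","):
    # Build a minimal SQL*Loader control file for a delimited flat file
    cols = ",\n  ".join(columns)
    return (
        f"LOAD DATA\n"
        f"INFILE '{datafile}'\n"
        f"APPEND INTO TABLE {table}\n"
        f"FIELDS TERMINATED BY '{delimiter}' OPTIONALLY ENCLOSED BY '\"'\n"
        f"(\n  {cols}\n)\n"
    )
```

A wrapper shell script would write this text to a .ctl file and invoke sqlldr with it, checking the exit status and log file afterward.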
Environment: Windows NT/2000, UNIX, Oracle 11g/10g/9i/8i SQL, PL/SQL, Java, ERWIN 4.7.