- Around 4 years of IT experience across all stages of the Software Development Life Cycle (SDLC): Business/Data Analysis, ETL Informatica Development, Data Modeling, Project Management, Data Mapping, Build, Unit Testing, System Integration and User Acceptance Testing.
- Experience in Information Technology with a strong background in database development and ETL for Data warehousing using Informatica.
- Superior SQL skills and ability to write and interpret complex SQL statements; also skilled in SQL optimization, ETL debugging and performance tuning.
- Business knowledge of Dynamics CRM for Electric and Gas customers, and of payroll and Insurance for production employees.
- Proficient in integrating various data sources, including multiple relational databases (MS SQL Server, MySQL, Oracle 11g, DB2), XML files and Flat Files, into the staging area, Dynamics CRM, the Data Warehouse and Data Marts.
- Experience in developing online transaction processing (OLTP), operational data store (ODS) and decision support system (DSS) databases (e.g., Data Warehouses).
- Strong familiarity with master data and metadata management and associated processes.
- Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, data profiling tools, and data and information system life cycle methodologies.
- Implemented Change Data Capture (CDC) with Informatica Power Exchange.
- Proficient with many different RDBMS, such as MySQL, MS SQL and PostgreSQL, as well as NoSQL databases like MongoDB and Cassandra.
- Experienced in Oracle, MySQL, MS SQL and DB2 database design, PL/SQL application development and back-end development using DDL and DML commands, SQL Queries, Table Partitioning, Collections, data Import and Export, Stored Procedures, Cursors, Functions, Triggers and Dynamic SQL with advanced PL/SQL techniques.
- Good experience in writing UNIX shell scripts, SQL scripts for development, automation of ETL process, error handling and auditing purposes.
- Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Configuration Management.
- Expertise in all areas of software development methodologies (Waterfall and Agile), including client interaction, requirements gathering, analysis and teleconferencing with the client during the progress of the project.
- Made use of Aggregator, Sort, Merge, Join, Change Capture and Peek stages in data loading jobs.
- Used DataStage Manager to import/export DataStage projects and jobs and to define table definitions in the repository.
- Used DataStage Director to debug, validate, schedule, run and monitor DataStage jobs.
- Experience in designing Job Batches and Job Sequences for scheduling server and parallel jobs using DataStage Director, UNIX scripts.
- Dimensional data modeling using Star Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling.
- Involved in performance tuning of ETL DataStage programs.
- Involved in Logical and Physical Design, Backup, Restore, Data Integration and Data Transformation Service.
- Functional knowledge on data models, database design development, data mining and segmentation techniques.
- Good knowledge of Java Object-Oriented Programming (OOP) concepts.
- Hands on experience with Core Java with Multithreading, Concurrency, Exception Handling, File handling, IO, Generics and Java collections.
- Have delivered large-scale solutions, coordinated projects with both Onshore and Offshore teams.
- Involved in daily Scrum meetings to keep track of the ongoing project status and issues.
- Used GIT for version control and regularly pushed the code to GitHub and Bitbucket.
- Ability to work independently as well as work with teams having varying backgrounds on complex issues and have strong verbal and written communication skills.
- Independent, enthusiastic team player with strong adaptability to new technologies.
- Excellent communication & interpersonal skills with proven abilities in resolving complex software issues.
- Strong Knowledge in AWS (Amazon Web Services), S3 Bucket and Redshift (AWS Relational Database).
- Strong Knowledge of the Hadoop Ecosystem (HDFS, HBase, Scala, Hive, Pig, Flume, NoSQL, etc.) and data modeling in a Hadoop environment.
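The Change Data Capture experience listed above can be illustrated with a minimal sketch. This is plain Python with invented row data, shown only to convey the classification logic; in the projects themselves CDC was implemented with Informatica PowerExchange, stored procedures and triggers.

```python
# Minimal sketch of change-data-capture (CDC) classification: compare a
# source snapshot against the current target by primary key and decide
# whether each row is an insert or an update. Row shapes are illustrative.

def capture_changes(source_rows, target_rows, key="id"):
    """Classify source rows as inserts or updates relative to the target."""
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)       # new key: not yet in the target
        elif existing != row:
            updates.append(row)       # key exists but attributes changed
    return inserts, updates

source = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
target = [{"id": 1, "name": "Alicia"}]
ins, upd = capture_changes(source, target)
print(ins)  # [{'id': 2, 'name': 'Bob'}]
print(upd)  # [{'id': 1, 'name': 'Alice'}]
```

Rows that match the target exactly fall through both branches and are ignored, which mirrors how a CDC feed only propagates net changes downstream.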
ETL: Informatica PowerCenter 10.0.1, 9.5.1, 9.0, 8.1.1, SAP Data Services 4.2
Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1
ETL Scheduling Tools: Autosys, Tivoli, Control M, ESP.
RDBMS: DB2, Oracle 11g/12c, SQL Server 2008/2012, MySQL, PostgreSQL
Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)
Data Modeling Tools: Erwin 9.3/7.5
Scripting: UNIX Shell scripting
Reporting Tools: Tableau 9, Cognos 8x/9x
Operating Systems: Windows XP/2000/9x/NT, UNIX
Source Management: Bitbucket, GIT
Programming Languages: C, C++, PL/SQL, Python, HTML, CSS, Java
Other Tools: Notepad++, Toad, SQL Navigator, JIRA, Rally
Roles & Responsibilities:
- Worked on requirements with Business Analysts and business users, and was also involved in working with data modelers.
- Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information for implementing the Data warehouses.
- Populated the Staging tables in the MS SQL database from various sources such as Flat files (Fixed Width and Delimited) and MySQL.
- Monitored the current data flow from the previous client's database via Informatica Cloud to the new Dynamics CRM Salesforce portal, routinely resolving data conflicts, including potential matches, via ETL error log files.
- Worked with PowerExchange for Microsoft Dynamics CRM (10.1.1) to connect to Dynamics CRM as a target.
- Knowledgeable about Dynamics CRM and the related business analysis.
- Implementation experience with CDC (Change Data Capture) using stored procedures, triggers and Informatica PowerExchange.
- Used SQL to extract and analyze customer data from different modules such as accounts, premise and sites to identify key metrics, transforming raw data into meaningful insights.
- Used Informatica PowerCenter as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Parsed high-level design spec to simple ETL coding and mapping standards.
- Participated in system analysis and data modeling, which included creating tables, views, indexes, synonyms, triggers, functions, procedures and packages.
- Created mappings, mapping configuration tasks and task flows with Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter (10.1.1 & 9.6.1).
- Created and used the Normalizer Transformation to normalize flat-file source data.
- Extensively built mappings with SCD1, SCD2 implementations as per requirement of the project to load Dimension and Fact tables.
- Created and implemented MS SQL Database components: Databases, Tables, Views, Constraints, Stored Procedures and Functions.
- Generated T-SQL scripts for Data retrieval, Manipulation & Validation for various applications.
- Involved in Performance Optimization of Queries & Stored Procedures by analyzing Query Plans, blocking queries, Identifying missing indexes etc.
- Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
- Used the Evaluate Expression option in the Debugger tool to validate and fix code while testing Informatica mappings.
- Provide production support and solve complex integration issues.
- Created mappings using various Transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, Java and Update Strategy.
- Worked in Production Support Environment as well as QA/TEST environments using Quality Center tool for projects, work orders, maintenance requests, bug fixes, enhancements, data changes.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Used TFS, VSTS and GIT for version control.
- Work closely with the teams to ensure architectural integrity related to integrations activities.
- Collaborate with business stakeholders for UAT and Quality Assurance for unit/technical testing.
- Performed Unit testing and created unit test plan of the code developed and involved in System testing and Integration testing as well. Coordinated with the testers and helped in the process of integration testing.
- Worked on performance tuning at both the Informatica and database levels by identifying the bottlenecks.
- Provide leadership and technical guidance within the integrations landscape, specifically Informatica.
- Work alongside team members to architect, design and develop quality deliverables.
- Troubleshooting, diagnostics & performance tuning in database cost-based optimizer mode.
- Solved Different Severity Tickets based on SLAs for data issues raised by Customers using trouble ticket system.
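The SCD1/SCD2 dimension loads mentioned above follow a standard pattern: on a change in a tracked attribute, expire the current dimension row and insert a new current version. The sketch below illustrates that Type 2 logic in plain Python with invented field names; the actual implementation used Informatica mappings with Update Strategy and Lookup transformations.

```python
from datetime import date

# Sketch of a Type 2 slowly changing dimension (SCD2) load: when a tracked
# attribute changes, the current row is end-dated and a new current row is
# inserted, preserving history. Field names are illustrative only.

def apply_scd2(dim_rows, incoming, key="cust_id", tracked=("address",),
               today=date(2020, 1, 1)):
    current = next(r for r in dim_rows
                   if r[key] == incoming[key] and r["is_current"])
    if all(current[c] == incoming[c] for c in tracked):
        return dim_rows                      # no change: nothing to do
    current["end_date"] = today              # close out the old version
    current["is_current"] = False
    new_row = {key: incoming[key], **{c: incoming[c] for c in tracked},
               "start_date": today, "end_date": None, "is_current": True}
    return dim_rows + [new_row]

dim = [{"cust_id": 7, "address": "12 Oak St", "start_date": date(2019, 1, 1),
        "end_date": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 7, "address": "99 Elm Ave"})
print(len(dim))  # 2: the expired version plus the new current one
```

An SCD1 load would instead overwrite the tracked attributes in place, which is why Type 2 is chosen when the business needs history for reporting.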
Confidential, Burbank, CA
Roles & Responsibilities:
- Develop complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables and Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL and Web Service transformations.
- Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.
- Assisted in building the ETL source-to-target specification documents.
- Effectively communicate with Business Users and Stakeholders.
- Worked on SQL coding to override the generated SQL query in Informatica.
- Involved in Unit testing for the validity of the data from different data sources.
- Design and develop PL/SQL packages, stored procedures, tables, views, indexes and functions.
- Wrote Complex SQL queries for Data manipulation, insertion, deletion and updates.
- Reviewed application Code to verify the accuracy, standards and optimizing the queries and minimizing the page upload times.
- Experience dealing with partitioned tables and automating partition drop/create operations in MySQL and MS SQL databases.
- Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
- Perform Data Conversion/Data migration using Informatica PowerCenter.
- Involved in performance tuning for a better data migration process.
- Create UNIX shell scripts for Informatica pre/post session operations.
- Document and present the production/support documents for the developed components when handing over the application to the production support team.
- Developed Unix scripts for processing Flat files.
- Scheduled the jobs in Autosys; good knowledge of other automation tools.
- Prepared Test Data and loaded it for Testing, Error handling and Analysis.
- Prepared the test cases and tested the ETL components for end to end process.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions and validations based on design specifications for Unit Testing and Systems Testing, including expected results.
- Created an Issue Log to identify the errors and used it for preventing any such errors in future development works.
- Worked on the production code fixes and data fixes.
- Responsible to troubleshoot the problems by monitoring all the Sessions that are scheduled, completed, running and used Debugger for complex problem troubleshooting.
- Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Mappings.
- Worked with Application support team in the deployment of the code to UAT and Production environments .
- Involved in production support, working on various mitigation tickets created while users worked to retrieve data from the database.
- Worked on deploying Metadata Manager features such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage and visibility into data objects, rules, transformations and reference data.
- Project based on the Agile SDLC methodology, with a software product release to business users every 2 weeks.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Used materialized views to create snapshots of the history of main tables and for reporting purposes.
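A pre-load flat-file check like the ones scripted for this role might look like the sketch below. It is written in Python rather than shell purely for illustration, and the expected file layout (`cust_id|name|balance`) is an assumption, not the actual client layout.

```python
import csv
import io

# Sketch of a pre-load check on a delimited flat file: verify the header
# and the column count of every record, and collect bad rows before the
# ETL session runs. The expected layout below is an assumed example.

EXPECTED_HEADER = ["cust_id", "name", "balance"]

def validate_flat_file(stream, delimiter="|"):
    reader = csv.reader(stream, delimiter=delimiter)
    header = next(reader)
    if header != EXPECTED_HEADER:
        raise ValueError(f"unexpected header: {header}")
    good, bad = [], []
    for lineno, row in enumerate(reader, start=2):
        (good if len(row) == len(EXPECTED_HEADER) else bad).append((lineno, row))
    return good, bad

sample = io.StringIO("cust_id|name|balance\n1|Alice|10.50\n2|Bob\n")
good, bad = validate_flat_file(sample)
print(len(good), len(bad))  # 1 1
```

Rejecting malformed records up front, with their line numbers, is what makes the downstream error-handling and audit steps tractable.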
Roles & Responsibilities:
- Developed Source-to-Target Mappings using Informatica PowerCenter Designer to load data from various source systems such as Oracle, MySQL, SQL Server and Flat files to the target.
- Involved in using SQL Loader to bulk load data in Oracle database.
- Communicated with business customers to discuss the issues and requirements.
- Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
- Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, Workflow and the Target Data.
- Tuned performance of Informatica session for large data files by partitioning and changing appropriate properties like increasing block size, data cache size, sequence buffer length and target based commit interval.
- Developed Shell Scripts, PL/SQL stored procedures, table and Index creation scripts.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Created unit test plans and test scenarios. Extensively used the Informatica Debugger for debugging the mappings associated with failed sessions.
- Involved in preparing detailed ETL design documents.
- Involved in unit, integration, functional and performance testing.
- Created deployment groups and performed migration of objects from DEV to QA.
- Responsible for creating data mapping, designing documents and unit test documents.
- Experience in writing queries using Oracle, MySQL and Microsoft SQL Server.
- Experience in fixing defects after code migration.
- Wrote Shell scripts for running workflows in a UNIX environment.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Performance tuning on sources, targets mappings and SQL (Optimization) tuning.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Executed the project in both waterfall and agile methodologies.
- Effectively worked in an Onsite and Offshore work model.
- Participated in daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
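The parameter files used above to define mapping variables, workflow variables and connections follow a simple sectioned name=value layout. The sketch below shows how such a file can be read; the section name and parameter values are invented for illustration and the parsing here is a simplification of what Informatica itself does.

```python
import io

# Sketch of reading an Informatica-style parameter file: bracketed section
# headers scope the parameters beneath them, and each parameter is a
# name=value pair (mapping variables conventionally start with $$).
# The sample content below is invented for illustration.

def parse_param_file(stream):
    params, section = {}, None
    for raw in stream:
        line = raw.strip()
        if not line or line.startswith("#"):
            continue                          # skip blank and comment lines
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

sample = io.StringIO(
    "[FolderA.WF:wf_daily_load]\n"
    "$$SRC_DIR=/data/incoming\n"
    "$$LOAD_DATE=2020-01-01\n"
)
print(parse_param_file(sample)["FolderA.WF:wf_daily_load"]["$$SRC_DIR"])
# /data/incoming
```

Keeping environment-specific values (directories, dates, connections) in such files is what allows the same workflow to run unchanged across DEV, QA and Production.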