Sr Informatica Developer Resume
Minnetonka, MN
SUMMARY:
- Informatica Developer with over 8 years of experience in data warehousing using Informatica PowerCenter 9.x/8.x/7.x/6.2/5.1/4.7.
- Skilled in creating ETL schedules using the Maestro tool and experienced in SQL performance tuning.
- Well versed in the Informatica PowerCenter client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Experience working in the Big Data/Hadoop ecosystem, including Hive, e.g. loading data from HDFS into Hive tables.
- Extensive experience optimizing the performance of existing and new mappings using various performance tuning techniques.
- Experience working with the Informatica repository and Repository Manager: creating user groups, users, and folders, versioning, and assigning permissions and privileges.
- Experience working as a Business/Data Analyst, interfacing with business users to prepare and update Business Process Requirements (BPR) documents based on analysis.
- Expertise in creating sessions, workflows, and worklets to run the logic embedded in mappings built in the PowerCenter Designer.
- Responsible for mapping and transforming existing feeds into new data structures and standards, utilizing Router, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
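The HDFS-to-Hive staging load mentioned above follows a standard pattern: define a delimited table, then `LOAD DATA INPATH` the landing files into it. A minimal sketch, assuming hypothetical table, column, and HDFS path names (on a cluster the generated script would be submitted with `hive -f`):

```shell
#!/bin/sh
# Generate the HiveQL for a typical HDFS-to-Hive staging load.
# All table, column, and HDFS path names are illustrative placeholders.
TABLE=member_stg
LANDING=/data/landing/members

cat > load_members.hql <<SQL
CREATE TABLE IF NOT EXISTS ${TABLE} (
  member_id STRING,
  person_id STRING,
  load_dt   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- LOAD DATA INPATH moves the files from the landing directory
-- into the table's warehouse location in HDFS.
LOAD DATA INPATH '${LANDING}' INTO TABLE ${TABLE};
SQL

# On the cluster this would be executed as: hive -f load_members.hql
```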
TECHNICAL SKILLS:
Data Warehousing: Informatica PowerCenter 9.x/8.6/7.1.2/6.2/5.1/4.7, Informatica PowerMart 6.2/5.1/4.7 (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor), Informatica PowerConnect, PowerExchange, ETL, OLAP, OLTP, Hadoop, Hive
Data Modeling: Dimensional data modeling (star schema, snowflake schema, fact and dimension tables), physical and logical data modeling, ER diagrams, Erwin 4.5/4.0/3.5, Oracle Designer 2000
Business Intelligence: Business Objects 6.5/6.0/5.1, WebIntelligence 2.5, Business Objects SDK, Cognos Series 8.0/7.1, Crystal Reports
Programming: SQL, PL/SQL, ANSI SQL, UNIX shell scripting, TOAD 7.x, SQL*Loader
Databases: Oracle 9i/8i, IBM DB2 8.0/7.0, MS SQL Server 6.5/7.0/2000, Teradata V2R5/V2R4, MS Access 2000, Developer 2000
Environment: Windows 95/98/2000/XP, Windows NT 4.0, UNIX, Sun Solaris, MS-DOS, OS/390
PROFESSIONAL EXPERIENCE:
Confidential, Minnetonka, MN
Sr Informatica developer
Responsibilities:
- Worked with data architects to gather requirements and developed the mappings based on the source-to-target (S2T) requirements.
- Worked on ETL metadata for assigning Confidential Person IDs to existing IFB members in HealthRules: loaded HealthRules data (CSV files with the latest member version data) into MDM via TIBCO, then ran the MDM job to process the HR extract into MDM and assign Confidential Person IDs.
- Coordinated with cross-functional teams in different locations to ensure data quality based on the mapping specifications.
- Created user groups, users, and folders, managed versioning, assigned permissions and privileges, and deployed migration objects across environments.
- Designed mappings using various transformations, including Router, Source Qualifier, Application Source Qualifier, Expression, Filter, Union, Aggregator, Update Strategy, Rank, and Lookup.
- Worked with flat files, including delimited flat files, based on business requirements. Created technical design documents.
- Involved in unit testing, integration testing, and pair testing with the QA team in the development environment.
- Participated in Scrum, design, and code review meetings.
- Created deployment groups and labels; verified and validated mappings, sessions, and parameter files after migration across all environments.
- Extensively optimized the performance of existing and new mappings using various performance tuning techniques.
- Applied pushdown optimization techniques as required.
- Worked on enhancements to existing projects as well as on new projects.
- Worked with the QA team to analyze and resolve raised defects.
- Extensively used the Autosys scheduling tool to schedule Informatica jobs.
- Worked in production support, resolved issues on time, and sent email alerts and communications to the team.
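An Autosys job definition for one of these Informatica workflow runs would look roughly like the JIL fragment below. The job, machine, folder, and workflow names are placeholders; the `pmcmd startworkflow` flags follow the Informatica 9.x client syntax (`-uv`/`-pv` read the user and password from environment variables rather than hard-coding them):

```
insert_job: mdm_hr_extract   job_type: c
command: pmcmd startworkflow -sv INT_SVC -d INFA_DOM -uv INFA_USER -pv INFA_PWD -f MDM_LOAD -wait wf_hr_extract_to_mdm
machine: etl-prod-01
owner: etladmin
start_times: "02:00"
days_of_week: all
description: "Nightly HR extract load into MDM"
```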
Environment: Informatica 9.x, Oracle, SQL, PL/SQL, Quality Center, Windows XP/2000/NT, UNIX.
Confidential, Minnetonka, MN
Sr Informatica developer
Responsibilities:
- Involved in understanding business processes and coordinated with business analysts to gather specific user requirements in Agile team and Scrum meetings.
- Coordinated with cross-functional teams in different locations to ensure data quality and developed the mappings based on the mapping specifications and requirements.
- Designed complex mappings using various transformations, including Router, Source Qualifier, Application Source Qualifier, Expression, Filter, Union, Normalizer, Aggregator, Update Strategy, Rank, and Lookup.
- Involved in Unit testing, Integration testing, and code reviews.
- Worked in the Hadoop environment: moved files from HDFS locations into Hive tables, created tables and loaded data into Hive per requirements, and performed analysis on the loaded data.
- Worked on Quality Center for defect tracking, analyzing and repairing the open defects.
- Involved in test case designs, test scenarios identification, test case execution.
- Extensively optimized the performance of existing and new mappings using various performance tuning techniques.
- Worked with the QA team to analyze and resolve raised defects.
- Extensively used Redwood to schedule the Informatica jobs in different environments using job definitions and job chains.
- Worked in production support, resolved issues on time, and sent email alerts and communications to the team.
Environment: Informatica 9.x, Hadoop, SQL, PL/SQL, Quality Center, Redwood scheduling tool, Windows XP/2000/NT, UNIX.
Confidential, St Paul, MN
Sr Informatica developer
Responsibilities:
- Responsible for Business Analysis and Requirements Gathering.
- Plan, execute, and manage the integration of new applications into existing systems throughout the enterprise
- Worked with Business Analysts and data stewards to identify source and Confidential data systems.
- Coordinated with cross-functional teams in different locations to ensure data quality.
- Prepared high-level design, technical specification, and source-to-Confidential mapping documents by gathering business requirements.
- Designed complex mappings using various transformations, including Router, Source Qualifier, Application Source Qualifier, Expression, Filter, Union, Normalizer, Aggregator, Update Strategy, Rank, and Lookup.
- Created mapplets for reusable code and used them in various mappings.
- Involved in Unit testing, Integration testing, and code reviews
- Worked on Quality Center for defect tracking, analyzing and repairing the open defects. Involved in test case designs, test scenarios identification, test case execution.
- Extensively optimized the performance of existing and new mappings using various performance tuning techniques.
- Worked with QA team to design test plan and test cases for User Acceptance Testing (UAT)
- Extensively used Tivoli to schedule the Informatica jobs in different environments.
Environment: Informatica 9.x, SQL, PL/SQL, Quality center, Windows XP/2000/NT, UNIX.
Confidential, Minneapolis, MN
Business/ Data Analyst/UAT
Responsibilities:
- Responsible for quality and timeliness of operations.
- Interfaced with business users to prepare and update Business Process Requirements (BPR).
- Created test cases and test scripts.
- Coordinated with cross-functional teams in different locations for data quality and analysis.
- Evaluated problems, situations, and crises; investigated solutions using data analysis practices and prepared analysis reports.
- Created production / analysis report and ensured all artifacts complied with corporate SDLC Policies / guidelines.
- Prioritized outstanding defects and system problems, ensuring accuracy and deadlines were met.
- Documented various data processes. Designed test plans and defined cases for functional, integration system, and user acceptance testing.
- Attended weekly defect report meetings and presented progress updates.
- Participated in requirement and design meetings and hosted Test case review meetings.
- Involved in test case designs, test scenarios identification, test case execution.
- Reported bugs and interacted with developers to resolve the problems.
- Involved in failed test case analysis and provided analyzed data logs to developers; performed analysis of the business requirements and rules.
- Prepared test scenarios and test cases based on the business requirements and rules.
- Analyzed application, system, and security errors.
- Escalated issues to developers and verified fixes.
- Worked in Quality Center to create test scenarios and defects in Test Plan and execute test cases in Test Lab as needed, raising defects and assigning them to the appropriate team in case of issues or risks.
- Updated the Learning documents specific to the Subject Areas. Interfaced with SMEs to prepare BPR documents for ongoing projects
- Validated technical designs created by IT developers against functional specifications.
- Worked with QA team to design test plan and test cases for User Acceptance Testing (UAT).
- Coordinated work plans between project manager and client using MS Project
- Extensively used the Autosys scheduler to schedule DataStage jobs in different environments.
Environment: Teradata SQL Assistant 12.0, AQT, SAS, SQL Navigator, DB2, Quality center, Windows XP/2000/NT, Linux, UNIX.
Confidential, Minneapolis, MN
Informatica Technical Lead/ETL Support
Responsibilities:
- Prepared high-level design, technical specification, and source-to-Confidential mapping documents by gathering business requirements.
- Created worklets and workflows that load data from and to Teradata using Teradata utilities like MLoad, FLoad, FastExport and TPump.
- Designed complex mappings using various transformations, including Router, Source Qualifier, Application Source Qualifier, Expression, Filter, Union, Normalizer, Aggregator, Update Strategy, Rank, and Lookup.
- Prepared BTEQ, FastExport, and MultiLoad scripts depending on the requirements.
- Created mappings that read from flat files and load the data into external databases.
- Created mapplets for reusable code and used them in various mappings.
- Extensively used parameter files for every workflow to define mapping and session parameters
- Worked on Quality Center for defect tracking, analyzing and repairing the open defects.
- Extensively optimized the performance of existing and new mappings using various performance tuning techniques.
- Extensively used the Redwood Scheduler to schedule the Informatica, Bteq, Mload, Fexport jobs in different environments.
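A BTEQ export script of the kind listed above looks roughly like the sketch below; the server, logon, database, and table names are placeholders (credentials were actually supplied via parameter files, not inline). The wrapper writes the script so it can be submitted with `bteq < export_members.bteq`:

```shell
#!/bin/sh
# Sketch of a BTEQ export step. Logon, database, and table names are
# illustrative placeholders, not the real project objects.
cat > export_members.bteq <<'BTEQ'
.LOGON tdprod/etl_user,etl_password;
.EXPORT DATA FILE = members_out.dat;
SELECT member_id, person_id
FROM   edw.member
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
BTEQ

# Submitted on the ETL host as: bteq < export_members.bteq
```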
Environment: Informatica 9.0/8.6.1, Teradata SQL Assistant 12.0, BTEQ scripts, Teradata utilities, Maestro Tivoli, Oracle 9i/10, Visio, SAP, HP-UX, SQL Navigator, SQL Server 2005.
Confidential, Portland, OR
Informatica Developer/Lead/ETL Support
Responsibilities:
- Prepared high-level design, technical specification, and source-to-Confidential mapping documents by gathering business requirements.
- Worked on determining the best alternative that would optimize the performance and time among the available alternatives.
- Extensively worked on loading and extracting data from SAP using BAPI transformation.
- Created worklets and workflows that load data from and to Teradata using Teradata utilities like MLoad, FLoad, FastExport and TPump.
- Designed complex mappings using various transformations, including Router, Source Qualifier, Application Source Qualifier, Expression, Filter, Union, Normalizer, Aggregator, Update Strategy, Rank, and Lookup.
- Developed different tasks like command, control, email and decision in various workflows depending on the requirements.
- Worked as the Informatica release manager besides developing the code, responsible for creating labels, applying the labels and creating the queries.
- Created mappings that load and read from flat files, XML files and Oracle tables.
- Worked on loading data to info objects using power exchange.
- Created mapplets for reusable code and used them in various mappings.
- Created FTP scripts to send the flat files and XML files to respective servers.
- Extensively used parameter files for every workflow to define mapping and session parameters
- Worked on Quality Center for defect tracking, analyzing and repairing the open defects.
- Extensively optimized the performance of existing and new mappings using various performance tuning techniques.
- Worked closely with Informatica Corp. in resolving issues.
- Scheduled Informatica batch jobs using Tivoli Maestro scheduling tool both event-based and time-based to run every 15 minutes during the business hours.
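The workflow parameter files mentioned above follow PowerCenter's `[Folder.WF:Workflow.ST:Session]` section format, with `$` session parameters (e.g. connections, file names) and `$$` mapping parameters. A sketch with hypothetical folder, workflow, session, connection, and path names:

```
[SAP_EXTRACTS.WF:wf_daily_orders.ST:s_m_stage_orders]
$DBConnection_Source=ORA_SRC_PROD
$DBConnection_Target=ORA_DW_PROD
$InputFile_Orders=/data/inbound/orders_20100615.csv
$$LOAD_DATE=2010-06-15
```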
Environment: Informatica 8.6/7.1.2, Oracle 8i/9i, UNIX, SQL, TOAD 8.0/9.0, Maestro Tivoli, ExamDiff, WinSCP, Code Promotion Application, WinSQL, SQL Server, Windows XP
Confidential, Minneapolis, MN
Informatica Developer/ETL Support
Environment: Informatica PowerCenter 7.1.2, Oracle 9.x, UNIX, SQL, PL/SQL, TOAD 8.0/9.0, ExamDiff, WinSCP, Code Promotion Application, WinSQL, SQL Server, Windows XP
Responsibilities:
- Involved in meetings with Business System Analysts to understand the functionality and prepared ETL Technical Specifications
- Designed mappings according to the mapping standards, naming standards of Confidential document for future application development
- Involved in analyzing the business requirements for the interface development from Staging Data Store (SDS) layer using Views to MODEL-N Application.
- Extracted data from SDS layer, transformed data as per the business requirements and loaded it into XML files
- Extensively used Informatica 7.1.2 to load data from source tables into Confidential targets: Oracle tables, flat files, XML files, JMS queues, and SQL Server.
- Involved in the development of Informatica mappings and tuned them for better performance.
- Used most transformations, including Source Qualifier, Aggregator, Expression, Lookup, Router, Normalizer, Filter, Update Strategy, Joiner, XML, and JMS transformations.
- Developed full-load interfaces for one-time initial loads and incremental-load interfaces using Oracle Change Data Capture for ongoing loads. Developed shell scripts to batch-process Informatica workflows using pmcmd and to SFTP the CSV flat files.
- Developed shell scripts to batch process Informatica workflows to create xml files and put xml files into a JMS queue
- Developed shell scripts to maintain archive of both records in database tables and flat files for a 4 day time period
- Scheduled Informatica batch jobs using Tivoli Maestro scheduling tool both event-based and time-based to run every 15 minutes during the business hours.
- Developed migration plan for production move and migrated code from test to production
- Created several complex mappings as per business requirements.
- Created reusable transformations and Mapplets and used them to reduce the development time and complexity of mappings and better maintenance
- Involved In the development of CDC process.
- Used Workflow manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions
- Created ETL Schedules using Maestro tool and Involved in SQL performance tuning
- Involved in developing UNIX shell scripts and extensively used PL/SQL and SQL.
- Tuned the SQL queries and PL/SQL code
- Involved in Unit testing, Integration testing, and code reviews
- Interacted with the Informatica admin team to migrate code from Dev to all environments (System, QA, Preprod, and Production).
- Involved in Cycle runs and fixing the Issues in all the environments
- Involved in production support and expertise in trouble shooting the application
- Involved in recovery of critical job failures in production environment. Coordinated with DBA’s, external clients and business users in the process of recovery
- Expertise in fixing the critical data issues in production
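The 4-day flat-file archive rotation described above can be sketched as a small shell step. The directory and workflow names are hypothetical, and the `pmcmd` workflow run is shown only as a comment because it requires the Informatica client on the host:

```shell
#!/bin/sh
# Sketch of the 4-day archive purge for batch flat files.
# ARCHIVE_DIR is a hypothetical path; override via the environment.
ARCHIVE_DIR=${ARCHIVE_DIR:-./archive/flatfiles}
mkdir -p "$ARCHIVE_DIR"

# In the real job a workflow run preceded the purge, e.g.:
# pmcmd startworkflow -sv INT_SVC -d INFA_DOM -f DAILY_LOAD -wait wf_stage_orders

# Delete flat files older than 4 days; -print logs what was purged.
find "$ARCHIVE_DIR" -type f -name '*.csv' -mtime +4 -print -delete
```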
Confidential, Atlanta, GA
ETL Informatica Developer
Responsibilities:
- Responsible for Business Analysis and Requirements Gathering.
- Translated requirements into business rules and made recommendations for Application Warehouse Development.
- Involved in designing the documentation and provided solutions and recommendations for the Data Transport Facilities, i.e., a method for migrating batch objects from the source system to the Confidential system.
- Involved in designing the high-level design documents, conceptual design documents, and architectural diagram of the ETL process according to the business requirements.
- Involved in designing documents for the infrastructure requirements and changes needed to develop and support the solutions, and provided data replication technology recommendations.
- Involved in designing documents for Informatica's current mode of operations across the development, production, testing, and disaster recovery environments.
- Used Informatica to extract and transform the data from existing, new and modified sources and finally loaded into the Staging area, DataMart, Confidential databases and Datawarehouse depending upon the requirements.
- Connect:Direct and Informatica PowerCenter were used for data transfer and ETL batch processing.
- Informatica was used for job scheduling and session/job-level monitoring and notification.
- An enterprise scheduling tool was used for controlling and managing batch jobs and workflows and for scheduling integrated information delivery extracts and reports.
- Parsed high-level design spec to simple ETL coding and mapping standards.
- Performed technical / functional requirements analysis towards the ETL architecture (Methodology, Framework) and migration of artifacts through interactions with Technical & Business Liaisons.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source and Confidential definitions, mappings, and sessions.
- Designed / developed Star Schema, Snowflake Schema and created Fact Tables and Dimension Tables for the Warehouse and Data Marts and Business Intelligence Applications using Erwin Tools.
- Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Aggregator, Update strategy & stored procedure transformations.
- Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Implemented Slowly Changing dimension methodology for accessing the full history of accounts and transaction information.
- Designed use of mapplets in Informatica to promote reusability, eliminate coding redundancy and ease maintenance of version control.
- Created sessions, workflows and worklets to run with the logic embedded in the mappings using Power center Designer.
- Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Provided the Informatica Error handling Techniques and Data quality solutions.
- Involved in resolving the data quality issues and provided recommendations for the Performance Improvement.
- Scheduled ETLs in the enterprise scheduler Control-M and extensively used Business Objects for report generation.
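The slowly changing dimension methodology mentioned above corresponds to Type 2 logic: expire the current dimension row when a tracked attribute changes, then insert the new version. A sketch of the equivalent SQL, with hypothetical dimension and staging table names (inside the mappings this logic lived in Lookup and Update Strategy transformations rather than hand-written SQL):

```shell
#!/bin/sh
# Write a Type 2 SCD maintenance script; table/column names are illustrative.
cat > scd2_account.sql <<'SQL'
-- Expire the current dimension row when a tracked attribute changes
UPDATE dim_account d
SET    eff_end_dt   = SYSDATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
  AND EXISTS (SELECT 1
              FROM   stg_account s
              WHERE  s.account_id = d.account_id
              AND    s.status    <> d.status);

-- Insert the new version as the current row
INSERT INTO dim_account (account_id, status, eff_start_dt, eff_end_dt, current_flag)
SELECT s.account_id, s.status, SYSDATE, NULL, 'Y'
FROM   stg_account s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_account d
                   WHERE  d.account_id   = s.account_id
                   AND    d.current_flag = 'Y'
                   AND    d.status       = s.status);
SQL
```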
Environment: Erwin 4.5, Informatica PowerCenter 7.1.2, PowerCenter Designer, Workflow Manager, Workflow Monitor, Control-M, Teradata utilities, Oracle 9.x, SQL, PL/SQL, UNIX 5.9, Business Objects XI, Windows XP, TOAD 8.