Technical Lead Resume
SUMMARY
- 14+ years of diversified experience in Information Technology and data warehousing / data hub development using Informatica PowerCenter, including developing strategies for Extraction, Transformation and Loading (ETL/ELT).
- Extensive experience across the complete Software Development Life Cycle (SDLC), including system requirements analysis, design, development, testing, maintenance, and enhancement on a variety of technological platforms, with special emphasis on data warehouse and business intelligence applications.
- Thorough understanding of software development methodologies such as Agile and Waterfall.
- Strong grounding in data warehousing concepts, including dimensional Star Schema and Snowflake Schema methodologies.
- Assessed, architected, and implemented end-to-end data warehouse solutions with a focus on data quality, profiling, and data security, contributing to improved decision making.
- Hands-on experience developing workflows, sessions, and mapplets, and using Workflow Monitor to schedule and control workflows, tasks, and sessions in Informatica PowerCenter 10.2.1/10.1.1/9.1.1/7.x and DataStage 8.x/7.x.
- Expertise in implementing complex business rules by creating complex mappings/mapplets, shortcuts, reusable transformations, and partitioned sessions.
- Extensively worked on Informatica PowerCenter Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator.
- Strong knowledge of and experience with OLTP and OLAP processes.
- Hands-on experience in performance tuning and resolving bottlenecks at various levels: sources, targets, mappings, and sessions.
- Provided extensive end-to-end 24/7 production support, monitoring data loads in a complex, high-volume environment using the QuickBase ticket-logging system.
- Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as Star Schema and Snowflake Schema used in relational, dimensional, and multidimensional modeling.
- As part of Operational Acceptance Testing (OAT), moved processed production data to lower-level UAT environments (other production-like environments) as requested by clients.
- Extensively worked on L1 and L2 production support issues; analyzed issue priority, escalated to the L3 team with details, and followed up on open items.
- Organized conference calls for groups to discuss production-related, time-sensitive, and schedule-related incidents.
- Used Informatica to read data from various data sources and legacy systems into Teradata using BTEQ, FastExport, MultiLoad, and FastLoad.
- Hands-on experience integrating various data sources, including Teradata, DB2, Oracle, SQL Server, flat files, and VSAM files.
- Expertise in developing SQL and PL/SQL code, including procedures, functions, triggers, packages, indexes, and collections, to implement business logic.
- Performed unit and system testing to verify that data loaded into targets was accurate.
- Hands-on experience writing UNIX and BTEQ scripts to automate ETL processes and Informatica pre- and post-session operations.
- Hands-on experience with Informatica Data Replication 9.1.1 (IDR) and Informatica Data Quality (IDQ).
- Ability to work under pressure; strong communication, presentation, analytical, and interpersonal skills, with the ability to interact with people at all levels, and sound decision making.
- Experience working with production support, change management, incident management, and deployment management teams.
- Able to work independently or as part of a team, including in fast-paced environments.
- Hands-on experience with Test Data Management (TDM).
- Currently training in Snowflake.
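As a concrete illustration of the UNIX/BTEQ automation mentioned above, a minimal pre-session validation step might look like the following sketch (file names and the record-count control-file convention are hypothetical, not from any specific project):

```shell
#!/bin/sh
# Pre-session validation sketch (hypothetical file names): verify the
# data file exists and is non-empty, and that its record count matches
# the count recorded in a companion control file, before the Informatica
# session is allowed to start.
check_source_file() {
    datafile="$1"
    ctlfile="$2"

    [ -s "$datafile" ] || { echo "ERROR: $datafile missing or empty" >&2; return 1; }
    [ -f "$ctlfile" ]  || { echo "ERROR: control file $ctlfile missing" >&2; return 1; }

    expected=$(cat "$ctlfile")          # control file holds the expected count
    actual=$(wc -l < "$datafile")       # actual records landed

    if [ "$actual" -ne "$expected" ]; then
        echo "ERROR: expected $expected records, found $actual" >&2
        return 1
    fi
    echo "OK: $datafile validated ($actual records)"
}
```

A wrapper like this would typically run as an Informatica pre-session command, failing the session early if the source feed is incomplete.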
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 10.2.1/10.1.1/9.6, Informatica Data Replication 9.1.1 (IDR), Informatica Data Quality (IDQ), IBM DataStage 8.1/8.0/7.5.1, TDM, SSIS
Data Modelling Tools: Erwin 4.2.2, Oracle SQL Data Modeler.
Platform/Database Tools: Teradata 14/13/12, Oracle 9i/10g, SQL Server, IBM DB2
Programming Languages / Tools: SQL, PL/SQL, shell scripting, BTEQ scripting, Teradata SQL Assistant, Toad 9.0.1
Reporting Tools: Business Objects, Cognos
Methodologies: E-R Modelling, Star schema, Snowflake schema
Scheduling tools: Control-M, WLM, Informatica Scheduling
Service Management Tools: ServiceNow
PROFESSIONAL EXPERIENCE
Confidential
Technical Lead
Responsibilities:
- Led inception (business scoping) and requirement gathering and analysis with the business, leading to story creation with functional impact analysis.
- Worked with the team to define strategy, goals, and scope, shaping the product roadmap and delivery, and converted BRDs into technical requirements.
- Coordinated with offshore and business teams, mentored team members, and negotiated effectively across various application teams.
- Served as a leader and motivator on cross-functional teams throughout the development lifecycle.
- Coordinated development of schedules for all testing cycles, including integration, SIT, and UAT.
- Participated in CAB meetings to discuss release scope and roadblocks; negotiated, planned, and managed all release activities.
- Coordinated with cross-functional teams and business partners; tracked project status; reviewed BRDs, business flows, and technical and functional designs.
- Built a future roadmap for product quality by developing an automation framework.
- Developed risk-mitigation plans covering continuous (24x7) maintenance and support activities, bug fixes, warranty support, and cutover plans.
- Used various Transformations like Joiner, Aggregator, Expression, Filter, Normalizer, Update Strategy and Lookup.
- Involved in mapping design, code changes, validating and testing mappings, and fixing defects.
- Used Debugger to test the data flow and fix the mappings.
- Led performance testing, test dry runs, and mock conversion testing for product releases.
- Automated the balancing process to eliminate manual intervention in the month-end process.
- Worked with Variables and Parameters in the mappings to pass the values between sessions.
- Identified mapping bottlenecks and tuned mappings and sessions, using parallel pipeline partitioning and pushdown optimization to achieve optimal performance.
- Created BTEQ scripts to perform the $n load process.
- Created post-session UNIX scripts to perform operations such as gunzip, file transfers, and removing and touching files.
- Involved in migrating objects from the SIT environment to the UAT environment.
- Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range, and Pass-through partitions.
- Implemented a monthly audit process for the Claims subject area to ensure the data warehouse matched source systems for critical reporting metrics.
- Involved in preparing UAT test cases and UAT test results.
- Created change requests for the production implementation process.
Environment: Informatica PowerCenter 10.2.1, Teradata, Teradata SQL Assistant, UNIX, BTEQ scripts, PuTTY, SQL Developer, flat files.
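The post-session UNIX operations listed above (gunzip, file transfer, touch) can be sketched as a short script; directory and file names below are hypothetical placeholders:

```shell
#!/bin/sh
# Post-session sketch (paths are hypothetical): decompress a landed
# .gz file, move the compressed original to an archive directory, and
# touch a trigger file so downstream jobs know the load completed.
post_session() {
    landed="$1"          # e.g. claims_20200101.dat.gz
    archive_dir="$2"
    trigger="$3"

    gzip -dc "$landed" > "${landed%.gz}" || return 1   # gunzip, keeping the original
    mv "$landed" "$archive_dir/"         || return 1   # archive the .gz
    touch "$trigger"                                   # signal completion
    echo "post-session complete: ${landed%.gz}"
}
```

In practice a script like this would be wired in as an Informatica post-session success command.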
Confidential
Team Lead
Responsibilities:
- Loaded source data to the landing area using Informatica Data Replication 9.1.1 (IDR).
- Designed and Developed different mappings using Informatica 9.1.1.
- Responsible for Extracting and transforming data from different source systems and loading to staging area using Informatica.
- Responsible for the development of business logic using Informatica and Oracle.
- Responsible for Unit and Integration Testing for the same.
- Analyzed the requirements provided, estimated tasks, and planned accordingly; prepared ETL specification documents, which were reviewed and finalized by the client, after which the build process started per the agreed timelines.
- Identified and tracked Slowly Changing Dimensions and facts according to business logic.
- Used various Transformations like Joiner, Aggregator, Expression, Filter, Update Strategy and Lookup.
- Identified mapping bottlenecks and tuned mappings and sessions, using parallel pipeline partitioning and pushdown optimization to achieve optimal performance.
- After code was deployed to production, monitored it until it stabilized per the specified criteria.
- Used Informatica IDE and IDQ to profile historical transactional data, assess its data quality, and cleanse it with IDQ.
- Used address and phone-number scrubbing transformations in IDQ to clean data, guided by the profiling analysis from IDE.
- Performed data standardization, verification, matching, and data quality checks on incoming source data.
- Worked in IDQ with reference tables, profiling, and Address Doctor.
Environment: Informatica Power Center 9.1.1, Informatic Data Replication (9.1.1), Informatica Data Quality (IDQ ), Oracle11, MS UNIX, BTEQ Script, Putty, SQL Developer, Flat files.
Confidential
Team Lead
Responsibilities:
- Analyzed raised transform requests and performed all pre-file and parameter checks required for execution.
- Updated parameters as required for the execution of run requests.
- Ran and monitored jobs, fixing defects caused by missing files, incorrect parameter values, or incorrect paths.
- Executed transform requests on time with zero defects.
- Checked files and data on UNIX servers.
- Assisted team members with development work.
- Held calls with senior team members to resolve major issues.
- Prioritized work items.
- Sent all transform execution reports on time, ensuring none were missed.
Environment: DataStage 8.1, Teradata, SQL Assistant, UNIX, BTEQ scripts, PuTTY, SQL Developer, flat files.
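The pre-file and parameter checks described above can be sketched as a small gatekeeper script; the required parameter names and file layout here are hypothetical:

```shell
#!/bin/sh
# Pre-run checks sketch (parameter names are hypothetical): confirm
# every expected input file has arrived non-empty and that required
# parameters are set before a transform request is executed.
prerun_checks() {
    paramfile="$1"; shift            # file of NAME=VALUE lines
    rc=0
    for f in "$@"; do                # remaining args: expected input files
        [ -s "$f" ] || { echo "MISSING: $f" >&2; rc=1; }
    done
    for p in RUN_DATE TARGET_ENV; do # required parameters (illustrative)
        grep -q "^${p}=" "$paramfile" || { echo "UNSET PARAM: $p" >&2; rc=1; }
    done
    [ $rc -eq 0 ] && echo "pre-run checks passed"
    return $rc
}
```

Running the transform only when this returns success prevents the missing-file and bad-parameter defects mentioned above from reaching execution.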
Confidential
Team Lead
Responsibilities:
- Understood business functionality and analyzed business requirements.
- Converted requirement specifications into detailed designs and implementations.
- Designed and developed ETL jobs with various transformations using Designer.
- Extracted data from sources such as Oracle and flat files, transformed it using business logic, and loaded it into the data warehouse.
- Worked extensively with transformations such as Aggregator, Joiner, Lookup, Sorter, Filter, Expression, Router, and Update Strategy.
- Used mapping parameters, mapping variables, and workflow variables to implement business logic.
- Created and maintained ETL jobs in Designer to perform the ETL process.
- Checked system health before scheduled job starts.
Environment: DataStage 8.1, Oracle 10g, SQL Assistant, UNIX, PuTTY, SQL Developer, flat files.
Confidential
Technical Lead
Responsibilities:
- Leading a team of 10 developers.
- Analyzed business requirements from functional specifications to design the ETL methodology in technical specifications.
- Responsible for development, support, and maintenance of ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.5.
- Experience integrating heterogeneous data sources such as Oracle, DB2, SQL Server, and flat files (fixed-width and delimited) into the staging area.
- Developed extraction mappings to load data from Source systems to ODS to Data Warehouse.
- Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
- Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
- Copied/exported/imported mappings, sessions, worklets, and workflows from the development repository to the test repository and promoted them to production.
- Created UNIX shell scripts to automate Informatica sessions. Used the Debugger wizard to remove bottlenecks at the source, transformation, and target levels for optimal use of sources, transformations, and target loads.
- Performed unit testing at various levels of the ETL.
Environment: Informatica PowerCenter 8.1, Oracle 10g, Toad, UNIX, Notepad++, WinSCP, PuTTY, flat files.
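The session-automation shell scripts described above typically wrap pmcmd, PowerCenter's command-line client. A minimal sketch follows; the service, domain, credential-variable, folder, and workflow names are placeholders, and DRY_RUN=1 prints the command instead of running it (so the wrapper can be exercised without an Informatica server):

```shell
#!/bin/sh
# Session-automation sketch around pmcmd (hypothetical service/domain
# names). -uv/-pv tell pmcmd to read credentials from environment
# variables; -wait blocks until the workflow finishes so the script's
# exit code reflects the run.
run_workflow() {
    folder="$1"
    workflow="$2"
    set -- pmcmd startworkflow -sv INT_SVC -d PM_DOMAIN \
           -uv PM_USER -pv PM_PASS -f "$folder" -wait "$workflow"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$@"                     # show the command for review/testing
    else
        "$@"                          # invoke pmcmd for real
    fi
}
```

A scheduler such as Control-M would then call `run_workflow DWH_CLAIMS wf_m_load_claims` and act on the exit status.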