
Technical Lead Resume


Rhode Island

SUMMARY

  • Nine years of IT experience in analysis, design, development, testing, implementation, and production troubleshooting/debugging in the areas of Teradata, UNIX, Control-M, Oracle, and the ETL tool Informatica PowerCenter.
  • Extensive experience providing Business Intelligence solutions in data warehousing for Decision Support Systems and OLTP/OLAP application development on Windows and UNIX platforms.
  • Experience working in Oracle 8i/9i/10g/11g/12c with database objects such as tables, views, materialized views, triggers, stored procedures, functions, packages, and indexes.
  • Expertise in performance tuning of complex queries in Oracle using multiple approaches, including analysis of AWR reports, explain plans, and execution plans.
  • Proficient in database programming and development using SQL Server, with good experience writing SQL (DDL, DML, DCL), triggers, views, user-defined functions, and complex stored procedures.
  • Extensive knowledge in identification of user requirements, system design, writing program specifications, coding, and implementation of systems.
  • Extensively worked with Teradata Viewpoint and Teradata changes for major production releases. Extensive experience with Teradata utilities (SQL, FastLoad, MultiLoad, FastExport, TPump, Teradata Parallel Transporter (TPT), etc.). Strong knowledge of Teradata architecture and its components.
  • Extensive experience creating and modifying Teradata objects, including tables, views, business views, temporary tables, and Teradata indexes.
  • Expertise in UNIX commands, firewall setup, and shell scripting. Created shell scripts to send data to downstream systems using FTP/SFTP (see the sketch after this list).
  • Expertise in scheduling jobs for automation in Control-M to avoid manual intervention.
  • Strong knowledge of data warehouse architecture and of designing star schemas, snowflake schemas, slowly changing dimensions, and fact and dimension tables.
  • Hands-on experience with the Informatica ILM Workbench for data masking.
  • Expertise in performance tuning of Informatica code. Experience in creating ER diagrams and data flow/process diagrams. Involved in the complete Software Development Life Cycle (SDLC) of data warehousing projects and worked on project planning and effort estimation.
  • Worked extensively with Informatica Workflow Manager (using tools such as Task Developer, Worklet and Workflow Designer, and Workflow Monitor) to build and run workflows.
  • Identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
  • Extensive experience with Informatica PowerCenter to implement data marts, including creating, debugging, and executing mappings, sessions, tasks, and workflows, and testing SQL queries.
  • Proficient in performance tuning of Informatica mappings, transformations, and sessions; experienced in optimizing query performance.
  • Worked on ETL methodologies supporting data extraction, transformation, and loading processes.
  • Over six years of experience handling ETL with Informatica 8.x/9.x; source and target databases include DB2, Oracle, SQL Server, and flat files (fixed-width and delimited).
  • Worked on Java/J2EE using the Struts MVC framework to build online screens per project requirements. Worked with web technologies such as HTML, JavaScript, CSS, JSP, and Servlets to create front-end pages.
  • Good knowledge of Splunk for analyzing raw data coming from different application and web layers. Created dashboards and reports for analyzing production alerts.
  • Ability to prioritize multiple tasks when several high-priority tasks arrive in parallel.
  • Extensively worked on root cause analysis for high-severity issues and resolved them within the given SLA.
  • Prepared test cases and performed unit, integration, system, and regression testing.
  • Worked in an Agile development model, participating in daily stand-up calls and providing daily updates on project progress.
  • Excellent communication and interpersonal skills. Good experience working in an offshore-onshore project delivery model.
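
For example, the FTP/SFTP delivery scripts mentioned above typically looked like the following minimal sketch. The host, directory, and file names are illustrative placeholders, not actual project values.

#!/bin/ksh
# Illustrative only: push a daily extract file to a downstream system over SFTP.
# EXTRACT_DIR, REMOTE_HOST, REMOTE_USER, and FILE_NAME are hypothetical placeholders.
EXTRACT_DIR=/data/outbound
REMOTE_HOST=downstream.example.com
REMOTE_USER=etluser
FILE_NAME=daily_extract_$(date +%Y%m%d).csv

# Fail fast if the extract is missing or empty.
if [ ! -s "$EXTRACT_DIR/$FILE_NAME" ]; then
    echo "Extract file missing or empty: $EXTRACT_DIR/$FILE_NAME" >&2
    exit 1
fi

# Batch-mode SFTP reading its commands from stdin; assumes key-based authentication.
printf 'cd /inbound\nput %s\nbye\n' "$EXTRACT_DIR/$FILE_NAME" | sftp -b - "$REMOTE_USER@$REMOTE_HOST"
rc=$?

if [ $rc -ne 0 ]; then
    echo "SFTP transfer failed with return code $rc" >&2
    exit $rc
fi
echo "Transfer of $FILE_NAME completed."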

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 8.6, 9.1, 9.5/9.6.

Tools and Utilities: Splunk, HPSM, PVCS Version Manager, SVN (Subversion), Visual SourceSafe, PuTTY, WS_FTP Pro, Eclipse, Rational Application Developer (RAD), SAS Enterprise Guide, Quick Build, Configuration Management, Git, GitHub, Jenkins, JIRA, etc.

DB Tools: Teradata SQL Assistant 13/14, SQL Developer, PL/SQL Developer, TOAD, SQL*Loader, etc.

Databases: Oracle 12c/11g/10g/9i, Teradata 13/14/15, DB2, SQL Server.

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, TPT, etc.

Languages: SQL, PL/SQL, Shell Scripting, Java/J2EE, SAS, Web Technologies.

Scripting Languages: UNIX Shell Scripts, JavaScript

Operating System: Unix-AIX, Linux, Windows XP/Vista/7/8/10.

Office Tools: Microsoft Office 2003/2007/2010/2013

Scheduling Tools: IBM Tivoli Workload Scheduler, Control-M

Methodologies: SDLC, AGILE, Waterfall

PROFESSIONAL EXPERIENCE

Technical Lead

Confidential, Rhode Island

Responsibilities:-

  • Worked in all stages of the SDLC: business, functional, and technical requirements gathering, design, documentation, development, and testing.
  • Worked closely with the business to understand their needs and provided inputs to address their needs and requirements.
  • Understood the customer subject area and the database-level changes.
  • Developed design documents with steps to achieve the requirements.
  • Developed Teradata TPT scripts and UNIX shell scripts for periodic/incremental loads to implement the requirements (see the sketch after this list).
  • Created, tested, and debugged stored procedures, functions, packages, cursors, and triggers using PL/SQL Developer.
  • Wrote complex SQL and used volatile tables for code modularity and to improve query performance.
  • Wrote scripts for extraction, transformation, and loading of data from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
  • Analyzed the system for the functionality required per the requirements and created the System Requirement Specification (Functional Requirement) document.
  • Once coding was completed per the design documents, deployed the code to lower environments (DEV, UAT, and REG) for integration testing and coordinated with those teams.
  • Performed unit testing, QA migration and support, defect fixing, and deployment setup, including implementation plans, QA documents, and RFC creation for the above work streams.
  • Created production support documents to hand over support to the offshore team. Involved in performance tuning of SQL queries, sources, targets, and sessions by identifying and rectifying performance bottlenecks.
  • Developed BTEQ scripts to populate tables in the Teradata EDW.
  • Worked with IS (Oracle and Teradata DBA) teams to implement DB changes and with the Configuration Management team to deploy code to production.
  • Worked closely with IS and middleware teams to apply patches/maintenance on application, DB, and web servers; as a Tier 1 application, downtime was available at night only.
  • Worked with middleware teams to set up connections for both web servers so that load could be shared equally between them and passed to both application servers.
  • Worked with EITS teams on maintenance, involved all stakeholders in step-by-step communications, and was responsible for sending the CMN to all other parties.
  • Used HP Quality Center for defect tracking and reporting, and held weekly risk/issue meetings with stakeholders.
  • Worked with the Production Control team to schedule jobs in Control-M for automation.
  • Worked closely with the CCB/RFC team to get changes approved for deployment.
  • Coordinated with multiple EITS teams to get the latest patches applied to databases and servers and firewall rules put in place, and performed end-to-end validation afterwards.
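
The periodic/incremental TPT and BTEQ loads referenced above would typically be driven by a UNIX wrapper along the lines of the sketch below. This is not actual project code: the script, job-variables, and table names (stg_orders_load.tpt, stg.orders_delta, edw.orders) are hypothetical, and the logon file is assumed to hold the Teradata credentials.

#!/bin/ksh
# Hypothetical incremental-load wrapper: run a TPT script to stage the delta,
# then a BTEQ step to merge the staged rows into the EDW target table.
JOB_VARS=/etl/vars/orders.jobvars       # assumed TPT job-variables file
LOGON_FILE=/home/etl/.tdlogon           # assumed file containing the BTEQ .LOGON command
JOB_DATE=$(date +%Y%m%d)
LOG_DIR=/var/log/etl

# Step 1: load the delta file into the staging table with TPT.
tbuild -f /etl/scripts/stg_orders_load.tpt -v "$JOB_VARS" -j stg_orders_"$JOB_DATE" \
    > "$LOG_DIR/tpt_orders_$JOB_DATE.log" 2>&1
if [ $? -ne 0 ]; then
    echo "TPT load failed; see $LOG_DIR/tpt_orders_$JOB_DATE.log" >&2
    exit 1
fi

# Step 2: apply the delta to the EDW table via BTEQ (update existing rows, insert new ones).
bteq <<EOF > "$LOG_DIR/bteq_orders_$JOB_DATE.log" 2>&1
.RUN FILE = $LOGON_FILE;

UPDATE tgt
FROM edw.orders AS tgt, stg.orders_delta AS src
SET order_status = src.order_status
WHERE tgt.order_id = src.order_id;

INSERT INTO edw.orders
SELECT src.*
FROM stg.orders_delta src
WHERE NOT EXISTS (SELECT 1 FROM edw.orders t WHERE t.order_id = src.order_id);

.IF ERRORCODE <> 0 THEN .QUIT 2;
.QUIT 0;
EOF
exit $?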

Team Lead

Confidential, Rhode Island

Responsibilities:-

  • Worked closely with the business/user groups to understand the business process and gather requirements.
  • Prepared detailed plans for the development of business requirements and enhancements and created business requirement specification documents.
  • Developed the CASE diagrams and high- and low-level designs, and modeled the application.
  • Responsible for construction of the programs affected by business requirements and enhancements.
  • Participated in data model (logical/physical) discussions and created both logical and physical data models.
  • Participated in Agile iterative sessions to develop extended logical models and physical models.
  • Created mapping documents and ETL design documents.
  • Analyzed and troubleshot Informatica failures in mappings, workflows, sessions, and long-running sessions.
  • Involved in configuration management to migrate code and Informatica objects to different environments.
  • Created mappings/workflows in Informatica to populate data from the source (OLTP) to the target (OLAP) depending on the requirement.
  • Created multiple jobs for reporting purposes from OLTP and sent the data to the business via FTP/SFTP.
  • Identified the required dependencies between ETL processes and triggers to schedule jobs to populate data on a scheduled basis.
  • Involved in different phases of the data warehouse life cycle, ranging from project planning, requirements gathering and analysis, offshore coordination, ETL design, ETL development and testing, reporting, integration, regression and user acceptance testing, implementation, system documentation, and support.
  • Optimized and tuned SQL queries used in the Source Qualifier of certain mappings to eliminate full table scans.
  • Created and configured workflows, worklets, and sessions to transport data to target warehouse tables using Informatica Workflow Manager.
  • Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Performed extraction, transformation and loading of data from RDBMS tables, Flat Files, SQL into Teradata in accordance with requirements and specifications.
  • Extensively worked on root cause analysis for high-severity issues and resolved them within the given SLA.
  • Carried out performance tuning in the ETL process to reduce processing time.
  • Created several SQL queries and reports using the above data mart for UAT and user reports, then moved them to production after obtaining sign-off.
  • Production/release support - escalation point for production issues.
  • Handled production issues and fixed data issues.
  • Created Teradata BTEQ scripts to implement business requirements and submitted them to the Control-M scheduler for automation and day-to-day business reporting (see the sketch after this list).
  • Involved in post-implementation support, user training, and fixing and logging defects in Team Track or HPSM for auditing purposes.
  • Provided support for multiple Tier 1 and Tier 2 applications and worked with application teams.
  • Worked on the RxConnect and JDA applications and handled multiple weekly planned maintenances to keep the systems up and running.
  • Trained the team internally to handle the system independently and coordinated with the onshore-offshore team on delivery status.
  • Involved in unit testing and system testing and worked in an Agile methodology.
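
A representative sketch of the kind of BTEQ reporting job submitted to Control-M, as mentioned above. The logon file and the table and column names (edw.sales, store_id, sale_amt) are hypothetical; the point is the use of a volatile table to hold the intermediate result before exporting the report.

#!/bin/ksh
# Hypothetical daily-reporting job scheduled through Control-M:
# build a volatile work table in BTEQ, then export the report extract.
LOGON_FILE=/home/etl/.tdlogon                       # assumed file containing the BTEQ .LOGON command
REPORT_FILE=/data/reports/daily_sales_$(date +%Y%m%d).txt

bteq <<EOF
.RUN FILE = $LOGON_FILE;

/* The volatile table keeps the intermediate aggregate local to this session. */
CREATE VOLATILE TABLE vt_daily_sales AS (
    SELECT store_id, SUM(sale_amt) AS total_sales
    FROM edw.sales
    WHERE sale_date = CURRENT_DATE - 1
    GROUP BY store_id
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

.EXPORT REPORT FILE = $REPORT_FILE;
SELECT store_id, total_sales
FROM vt_daily_sales
ORDER BY total_sales DESC;
.EXPORT RESET;

.IF ERRORCODE <> 0 THEN .QUIT 1;
.QUIT 0;
EOF
exit $?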

Computer Programmer

Confidential

Responsibilities:-

  • Checked that all elements to be masked were already identified as part of the HPSM Data Discovery phase.
  • Any test cases that use PHI/PII data from production for validation would need to be updated; this should be done the same way as after a production refresh.
  • Ensured that a refresh would be done before the masking process runs, using the existing refresh processes of the HPSM application.
  • Prepared documentation of the architecture and support processes and updated it in SharePoint so that everyone on the team can access it.
  • Used Control-M as the scheduler for the data masking jobs (see the sketch after this list).
  • It is the responsibility of the application team to run the jobs interfacing with upstream applications using test data, not production data.
  • All updates made to the online database with masking rules should be mirrored to the repository database; hence no separate refresh or masking process needs to be executed for the reporting database.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator transformations.
  • Optimized and tuned SQL queries used in the Source Qualifier of certain mappings to eliminate full table scans.
  • Created and configured workflows, worklets, and sessions to transport data to target warehouse tables using Informatica Workflow Manager.
  • Created data mapping documents as part of the data migration projects.
  • Wrote complex SQL and used volatile tables for code modularity and to improve query performance.
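
A minimal sketch of how the Control-M-scheduled masking jobs mentioned above could invoke an Informatica workflow through the pmcmd command line. The domain, integration service, folder, and workflow names (Domain_Dev, IS_DEV, MASKING, wf_mask_hpsm_phi) are placeholders, not actual project values, and the password is assumed to be supplied through an environment variable.

#!/bin/ksh
# Hypothetical Control-M job step: start the Informatica masking workflow and
# wait for completion so that Control-M can act on the return code.
INFA_USER=etl_batch
INFA_PWD_VAR=ETL_BATCH_PWD      # name of the environment variable holding the password

pmcmd startworkflow \
    -sv IS_DEV \
    -d Domain_Dev \
    -u "$INFA_USER" \
    -pv "$INFA_PWD_VAR" \
    -f MASKING \
    -wait wf_mask_hpsm_phi
rc=$?

if [ $rc -ne 0 ]; then
    echo "Masking workflow failed with pmcmd return code $rc" >&2
    exit $rc
fi
echo "Masking workflow completed successfully."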

Quality Assurance Programmer

Confidential

Responsibilities:-

  • Involved in requirement gathering and requirement analysis. Interacted with the client in various forums to discuss the status of the project.
  • Performed unit testing, QA migration and support, defect fixing, and deployment setup, including implementation plans, QA documents, and RFC creation for the above work streams.
  • Documented the architecture, hardware, software, local environment configuration, and support processes, and updated them in SharePoint.
  • Provided impact analysis and testing recommendations to ensure continuous system availability when there are changes to the infrastructure and/or interfaces.
  • Created new mappings per the business requirements in Informatica.
  • Created low-level and high-level documents for the mappings and was involved in both technical and functional testing.
  • Involved in performance tuning for long-running queries.
  • Worked on complex queries to map the data per the requirements.
  • Captured metadata, capacity plans, and performance metrics for all queries being created; defined the archival strategy and provided guidance for performance tuning; fixed performance issues and worked on query optimization.
  • Developed Informatica mappings and workflows with complex business logic to load data from source to DB2 and extensively worked on validation queries (see the sketch after this list).
  • Interacted with third-party teams such as DBA and migration teams for migrating code to QA and PROD.
  • Performed unit, functional, integration, and performance testing, thereby ensuring data quality.
  • Backed up the team wherever required, made sure the team was aware of the complexity of the stories, and challenged the team on requirements during the estimation process.
  • Created, updated, and maintained ETL technical documentation.
  • Worked on CRs submitted by the business to enhance/modify existing functionality.
  • Performed the QA process on assigned batches, coding in SAS using SAS Enterprise Guide.
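
The validation queries mentioned above typically amounted to reconciling row counts between the loaded DB2 target and an ETL audit trail. The sketch below assumes the DB2 command line processor is available on the batch server; the database, schema, target table, and audit-table names are hypothetical.

#!/bin/ksh
# Hypothetical post-load validation: compare the row count in the DB2 target
# table against the source count recorded in an ETL audit table.
DB_NAME=EDWDB
SCHEMA=DWH

db2 connect to "$DB_NAME" > /dev/null || exit 1

target_count=$(db2 -x "SELECT COUNT(*) FROM $SCHEMA.CUSTOMER_DIM" | tr -d ' ')
audit_count=$(db2 -x "SELECT src_row_count FROM $SCHEMA.ETL_AUDIT WHERE load_date = CURRENT DATE AND table_name = 'CUSTOMER_DIM'" | tr -d ' ')

db2 terminate > /dev/null

if [ "$target_count" != "$audit_count" ]; then
    echo "Validation failed: target=$target_count, audit=$audit_count" >&2
    exit 1
fi
echo "Validation passed: $target_count rows loaded."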

Confidential

Tools: SQL Developer

Responsibilities:-

  • Communicated with the clients to understand requirements and design in detail.
  • Understood business requirements and converted them into technical requirements by creating System Requirements Specification (SRS) documents.
  • Responsible for creating high- and low-level diagrams using the Unified Modeling Language (UML).
  • After understanding the requirements, built a prototype using HTML and web technologies, then sent it to the client for approval and sign-off.
  • The client reviewed the prototype and updated the team if there were any changes to the existing functionality, and we worked per the updated requirements.
  • After getting sign-off from the client, converted the prototype into online pages using the Struts (MVC) framework and Hibernate, and wrote stored procedures to support the back-end application.
  • Created multiple pages to enhance functionality per the requirements within the SLA defined by the client.
  • Provided impact analysis and testing recommendations to ensure continuous system availability when there are changes to the infrastructure and/or interfaces.
  • Developed new methods to prevent system failures and ensure optimum system reliability.
  • Once development was completed, performed unit testing of the code and created test cases per the client's needs.
  • Provided guidance to team members on writing unit test case plans and fixed bugs found during testing.
  • Once unit testing was completed, committed the code and deployed it to UAT so that users could start their testing.
  • Worked with the QA team and business users to help fix bugs/issues found while testing the application in UAT after integrating with other components.
  • Used HPSM/Team Track for logging and tracking the status of defects logged by the team or business users.
  • Once the client provided sign-off for UAT, deployed the changes to production and monitored for a couple of weeks so that issues could be fixed on time without delay.
