
Technical Lead Resume


Nebraska

PROFESSIONAL SUMMARY:

  • Around 9 years of experience in Application Design, Data Modeling, Development and Implementation of Data warehousing using Informatica, Oracle, Talend and SQL Server.
  • Expertise in OLTP/OLAP system study, Analysis and developing Database Schemas like star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
  • Experience in Insurance, Agricultural and Technological industries with excellent leadership, analytical and verbal communication skills.
  • Experienced with Agile approaches, including Extreme Programming, Test-Driven Development, Unit Testing, Code Reviews and Scrum.
  • Expertise in advocating and implementing Lean UX and Minimum Viable Product (MVP) software development approaches for secure SDLC.
  • Good working knowledge of data quality and data profiling using Informatica Data Profiler and IDQ.
  • Trained teammates in Informatica and provided feedback for personal and professional growth.
  • Experience in setting up CyberSecurity Standards across the enterprise and System Scoping for SOC2 Audit Readiness.
  • Good Experience in programming in PL/SQL with Triggers, Stored procedures, functions in Oracle database.
  • Designed and maintained several applications as an Application Architect working with both internal and external vendors.
  • Good experience in installation and configuration of Informatica PowerCenter 6.x/7.x/8.x/9.x.
  • Upgraded Informatica from 6.x to 7.x, 7.x to 8.x and 8.x to 9.x.
  • Extensive knowledge of various Performance Tuning Techniques on Sources, Targets, Mappings and Workflows using Partitions/Parallelization and eliminating Cache Intensive Transformations.
  • Migrated Informatica code from Dev to QA and QA to Production environments.
  • Good exposure to the Oracle GoldenGate data integration tool for real-time Confidential.
  • Ensured appropriate privileges were assigned to the different users.
  • Hands on experience in defining Technical Requirements for ETL processes and developing Complex Mappings to load data into enterprise data warehouse and data marts.
  • Designed and developed ETL processes using Informatica PowerMart/PowerCenter with Oracle, SQL Server, Teradata, XML, MS Access and FLAT files.
  • Expertise in documenting the ETL process, Source to Target Mapping specifications, status reports and meeting minutes.
  • Experienced in Service Oriented Architecture (SOA), Enterprise Sequencing, Project Discovery, Requirement Gathering and analysis.
  • Good experience in data analysis, error handling, error remediation and downstream impact analysis.
  • Well versed with the business functioning, end user requirements gathering and developing process in accordance with the business rules.
  • Used Autosys, Tidal and Informatica Scheduler for scheduling jobs.
  • Instilled a customer-first attitude and learn-from-your-user concepts into software development.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 6.x/7.x/8.x/9.6.1, Power Exchange 9, Metadata Manager, Informatica Data Quality (IDQ), Talend Open Studio, SSIS

Databases: MS SQL Server 2000/08/12/14, Oracle 9i/10g/11g, MS Access, MySQL, Teradata, Flat File systems, JSON Files

OS: Windows 98/2000/XP/Vista/Win 7/8.1, UNIX, MS-DOS

Languages: SQL, T-SQL, PL/SQL, C, C++, HTML, R, C#, UNIX shell scripts, XML

Design Tools: ERwin, SQL* Loader, Visio, EASparx

Database Tools: Toad 10.5, SQL Developer 2.1.1, WinSCP 4.2.8, Oracle GoldenGate 11g, SQL Server Management Studio 2012/14

Reporting Tools: OBIEE 10g, SFDC, Cognos Report Studio, SSRS

Scheduling Tools: Informatica Scheduler, Autosys, Tidal, Visual Studio, SSMS

EXPERIENCE:

Confidential, Nebraska

Technical Lead

Responsibilities:

  • Created new Vendor Management process to be in compliance with BCBSA mandate for CyberSecurity Standards.
  • Worked with Health Network Services (HNS) to understand their Provider data needs and created a one-stop data reporting layer.
  • Collaborated with team leadership to identify areas of improvement for the team, assigned owners for each task and tracked progress.
  • Extracted Provider and Payment information from datamarts into flat files using Informatica workflows, as required for setting up the VCard virtual payment process for providers.
  • Automated the extraction of JSON files from the semantic layer over the DataHubs to send Provider information to the Centers for Medicare & Medicaid Services (CMS).
  • Designed and developed multiple Informatica ETL workflows to load flat files from an external vendor into a staging Raw database, migrate them to the IDS layer and then to the DataHubs for downstream consumption through the semantic layer.
  • Onboarded new developers by creating an onboarding checklist and walking them through existing processes and standards.
  • As an Application Architect, reviewed the business requirements, created the design, dataflow and reporting layers for business users.
  • Collaborated with Architects and Technical Leads across the enterprise in creating an Extract Framework for data extracts from IDS/Datamarts/DataHubs.
  • Worked in a subgroup on replacing the enterprise scheduler, Tidal, with a more stable solution.
  • Assisted in implementing Major Identifier Repository for isolating Sensitive information to protect member PHI and PII data.
  • Developed detailed process flow document for ETL workflows to follow the set ETL standards at enterprise level.
  • Introduced Talend Open Studio for data integration as part of enterprise technology innovation for working with Hadoop BigData.
  • Used Team Foundation Server to track the progress of various projects, remove any impediments and deploy the code to CI and Lab environment.
  • Attained understanding of R language and its usage as part of Machine Learning and data analysis for Business Analytics.
  • Worked on the Data Model and made the necessary updates to the table structures based on the Product Owner’s input.
  • Code reviewed the ETL process and validated the data with Test Engineer, Architect and Enterprise Data Governance.
  • As a Tech Lead, assisted in Data analysis and validation using Levenshtein distance algorithm for Medical Policy tool.
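The Levenshtein distance check mentioned above measures the minimum number of single-character edits between two strings, which makes it a useful fuzzy-match test during data validation. A minimal sketch in Python (a generic implementation; the helper and threshold below are illustrative, not the Medical Policy tool's actual code):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits (insert, delete,
    substitute) needed to turn string a into string b."""
    if len(a) < len(b):
        a, b = b, a  # iterate over the longer string row-wise
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def is_near_match(x: str, y: str, max_ratio: float = 0.2) -> bool:
    """Flag near-duplicates whose edit distance is small relative to
    their length (the 0.2 ratio is an illustrative threshold)."""
    return levenshtein(x, y) <= max_ratio * max(len(x), len(y))
```

A validation pass can apply `is_near_match` to pairs of incoming and reference values to surface likely typos for remediation.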

Confidential

Responsibilities:

  • Worked as a Cross-Functional developer in an Agile team that maintains the self-service website - Confidential .
  • Fixed several production issues and refactored ETL process for updated business requirements.
  • Scheduled several ad hoc workflows to update Member information as and when required.
  • Conducted daily standup meetings and ceremonies as part of Agile team norms.
  • Worked in an Agile environment and implemented Test-Driven Development, Continuous Integration.
  • Updated the process flow document for ETL workflows with the scheduling and production support information.
  • Maintained the ETL process failure information and submitted Change Control documentation for production fixes and updated the incorrect member information in Confidential .
  • Continuously sent documentation to the Delivery and Service teams, keeping them updated on change controls and production support.
  • Introduced the concept of integrating ETL into Continuous Integration using Lab environment as part of Agile methodology.
  • Used Team Foundation Server to track the project progress and deploy the code to CI and Lab environment.
  • Worked with the DA to understand the Business Requirements and Business Rules and designed the ETL accordingly.
  • Helped teammates develop in areas where they were lagging behind by providing feedback on skills such as communication, documentation and leadership.
  • Code reviewed the ETL process and validated the data with Data analyst, DBA and the Product Owner.
  • Trained team members in Informatica ETL development and guided them in workflow creation.
  • Improved the performance of the ETL workflow by using bulk loads and by dropping and recreating indexes on the target tables.
  • Improved the Member experience of the site by using a table-swap process to avoid site downtime.
  • Fixed the data errors encountered by unit testing the ETL code in development environment and code reviewed with the ETL backup resource.
  • Facilitated meetings with several internal teams, documented the decisions made and followed up after the meetings.
  • Coordinated with the Systems Analyst to develop an ETL job that generates the Activation Letter flat file, scheduled to run daily using the Tidal Scheduler.
  • Used MS - Test Manager to upload, update and maintain the unit test cases.
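One common way to implement the table-swap pattern mentioned above on SQL Server is to load a staging copy of the table and then rename tables inside a single transaction, so readers never query a half-loaded table. The helper below only builds the T-SQL statements; the table names are illustrative and this is a sketch of the idea, not the production code:

```python
def build_table_swap_sql(live: str, staging: str, retired: str) -> list[str]:
    """Build T-SQL that swaps a freshly loaded staging table into place.
    Renaming inside one transaction keeps the swap near-instant, which
    avoids site downtime during the load window."""
    return [
        "BEGIN TRANSACTION;",
        f"EXEC sp_rename '{live}', '{retired}';",    # current table -> retired name
        f"EXEC sp_rename '{staging}', '{live}';",    # loaded staging -> live name
        "COMMIT TRANSACTION;",
    ]
```

An actual job would execute these statements through the ETL tool's SQL task or a database driver, then validate row counts and drop the retired copy.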

Environment: Informatica 8.6.1/9.1.0/9.6.1 (Power Center), Repository Manager, Talend Open Studio, SSIS, SQL Server 2010/12/14, MS-WORD, Excel, T-SQL, Visio, Tidal, MS-TFS, MS-Visual Studio 2008/10/12/13, MS-MTM, JSON, XML Altova, PowerShell

Confidential, Nebraska

ETL Developer

Responsibilities:

  • Worked on various work orders to fix and find solutions to existing ETL code.
  • Worked with the Business users and the data/system analysts to understand the Business rules and Business requirements and updated the ETL code accordingly.
  • Conducted Informatica code review meetings and updated the ETL code based on input from team members, ensuring the code was in line with the set standards.
  • Automated the ETL jobs using Tidal scheduler and Informatica in-built scheduler.
  • Updated Informatica workflow schedules, set naming standards for ETL objects and purged unused objects.
  • Wrote and executed ETL test cases as part of testing the Informatica code before migrating it to Production.
  • Involved in debugging mappings to fix data transformation errors, also involved in tuning Sessions to improve performance.
  • Analyzed the source data and decided on appropriate extraction, transformation and loading strategy.
  • Interacted with the end users to gather requirements, plan, build, test, deploy, document and apply change control process.
  • As part of an Agile team, implemented Test-Driven Development by working with the SA/DA/Tester before the actual ETL development started.
  • Implemented Pair Programming, Code Reviews and Unit testing as part of Agile Methodology.
  • Worked with the Data modeler and finalized the data model based on the business requirements.
  • Executed ad hoc workflow runs to accommodate business requests as part of emergency change requests.
  • Handled successful production code fixes with minimum/no impact to business.
  • Implemented Global Parameter File concept and standardized the code migration from Development to Testing and then to Production.
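The global parameter file concept above keeps environment-specific values (connections, file paths) outside the mappings, so the same code is promoted unchanged from Development to Testing to Production and only the parameter file differs per environment. A minimal sketch of the format (the folder, workflow and connection names below are illustrative):

```
[Global]
$$LoadDate=SYSDATE
$PMSourceFileDir=/data/dev/srcfiles

[ETL_Folder.WF:wf_load_members]
$DBConnection_Source=DEV_SRC_ORA
$DBConnection_Target=DEV_DWH_ORA
```

Promotion then amounts to pointing each environment's sessions at its own copy of this file.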

Environment: Informatica 8.6.1/ 9.1.0 (Power Center), Power Exchange 9, SSIS, Repository Manager, SQL Server 10, MS-WORD, Excel, T-SQL, Visio, Tidal, MS-TFS, MS-Visual Studio 2008, XML Altova, PowerShell, Lab Manager

Confidential

ETL Lead Developer

Responsibilities:

  • Worked with end users to understand the Business Requirements and Business Rules and designed the specifications for development.
  • Responsible for all ongoing Data Model Design decisions and database implementation Strategies.
  • Involved in data modeling and design of the data warehouse in star schema methodology with conformed and granular dimensions and fact tables.
  • Worked with Senior Business Management to develop Business Requirements Documentation, define High level Data Flows and supplied Technical Solutions to specific Business Needs.
  • Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems
  • Involved in analyzing the source data and decision making on appropriate extraction, transformation and loading strategy.
  • Configured database and ODBC connectivity to various source/target databases.
  • Extracted and Processed data from various sources like Flat Files and Relational Databases.
  • Interacted with Informatica Administrators to create and maintain each PowerCenter instance.
  • Designed and developed Informatica Mappings and sessions based on business user requirements and business rules.
  • Improved the Informatica Mappings and Session performance by eliminating extensive cache usage transformations, modifying session properties and partitioning the mapping.
  • Created mappings to read from flat files and RDBMS sources and to load into RDBMS tables.
  • Assisted data modeler in creating physical and logical models.
  • Extensively used Source Qualifier, Aggregator, Filter, Joiner, Expression, Lookup, Router, Sequence Generator, Update Strategy, Union and SQL transformations to create mappings.
  • Involved in creation of Reports using OBIEE 10g for various Subject Areas.
  • Prepared unit test plans and tested against them.
  • Wrote and executed data validation scripts as part of QA testing.
  • Documented the complete ETL Process along with the Source-Target Matrix documentation and Performance Issues.
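At the core of a star-schema fact load like the one described above, a Lookup transformation resolves each row's natural key to a dimension surrogate key, with an unknown-member default when no match exists. A small Python sketch of that behavior (the column names are illustrative):

```python
def load_fact(rows: list[dict], dim_lookup: dict, default_sk: int = -1) -> list[dict]:
    """Mimic a Lookup transformation feeding a fact table: resolve each
    row's customer natural key to its surrogate key, defaulting to the
    unknown-member row (-1) when the dimension has no match."""
    fact = []
    for row in rows:
        sk = dim_lookup.get(row["customer_nk"], default_sk)
        fact.append({"customer_sk": sk, "amount": row["amount"]})
    return fact
```

Routing unmatched keys to a default member (rather than dropping rows) keeps fact totals intact while flagging dimension gaps for remediation.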

Confidential

Responsibilities:

  • Worked in an Onsite-Offshore model and assisted the offshore team for Development and Data Validation related issues.
  • Worked with Data Modeler and Business Analyst to understand the Business Requirements and Business Rules.
  • Implemented Agile Methodology for handling the ever-changing Business Requirements.
  • Configured database and ODBC connectivity to various source/target databases.
  • Extracted data from various sources (flat files, SQL Server, MS Access and Oracle databases).
  • Migrated UNIX/Informatica code from Dev to QA and QA to Production environments.
  • Involved in Integration Requirements Document (Technical Requirements) reviews and validated the document against the functional requirements document.
  • Worked on creation of Macros for creating CSV files from MS-Access database to overcome the ODBC connectivity issue.
  • Involved in finalizing the loading procedures and defined dependencies among workflows and sessions.
  • Improved the Informatica Mappings and Session performance by eliminating extensive cache usage transformations, modifying session properties and partitioning the mapping.
  • Improved the Informatica Session performance by introducing the concept of Initial and Incremental sessions, with Incremental workflows not caching the lookup tables.
  • Provided guidance to the Off-Shore team on Various ETL Performance Tuning and Data Validation Issues.
  • Designed workflows for various sessions with Decision, assignment task, Event-wait, Event-Raise tasks.
  • Used Informatica in-built scheduler for Job Scheduling.
  • Implemented performance tuning, SQL query optimization, partitioning and parallelism to improve throughput.
  • Performed daily Data Refreshes from Dev to QA and Created Schema copies for parallel development and CRP/UAT Testing.
  • Worked closely with OBIEE developer and developed different Dashboards, Subject-Areas, Pages and Reports.
  • Involved in KT sessions for implementing subject-area security, report security, data-access security and user security in OBIEE.
  • Worked with the Off-shore team in preparing Test Cases and Validated the Test Cases prepared by them.
  • Involved in Data Validation Process for Various sources and provided appropriate remedies.
  • Worked directly with the end users and got approval for the Data Reports and Dashboards through CRP sessions.
  • Used SFDC for bug tracking in CRP sessions and during User Acceptance Testing (UAT).
  • Documented the release management process, upgrade process, security setup, issue log, ETL migration and database changes.

Environment: Informatica 8.6.1(Power Center), Repository Manager, Oracle 10g/11g, OBIEE 10g, SFDC, MS-Access, MS-WORD, Excel, Visio, Toad 10.5, SQL Developer 2.1.1, WinSCP 4.2.8, SQL*Plus, Autosys.

Confidential

Senior Informatica Developer

Responsibilities:

  • Configured database and ODBC connectivity to various source/target databases.
  • Extracted and Processed data from various sources like Flat Files and Relational Databases.
  • Interacted with Informatica Administrators to create and maintain each PowerCenter instance.
  • Worked with end users to understand the Business Requirements and Business Rules and designed the specifications accordingly.
  • Designed and developed Informatica Mappings and sessions based on business user requirements and business rules.
  • Improved the Informatica Mappings and Session performance by eliminating extensive cache usage transformations, modifying session properties and partitioning the mapping.
  • Improved the Informatica Session performance by introducing the concept of Initial and Incremental sessions, with Incremental workflows not caching the lookup tables.
  • Involved in Optimizing the Performance by eliminating Target, Source, Mapping, and Session bottlenecks.
  • Worked with Oracle DBA for loading data into reference tables, using Oracle GoldenGate for near-real time Change Data Capture.
  • Created mappings to read from flat files and RDBMS sources and to load into RDBMS tables.
  • Involved in Data model reviews and validated the tables, columns and data types to meet the requirements.
  • Created source and target table definitions using Informatica Designer.
  • Used Error Handling Mapplet to capture error data into PMERR tables for handling nulls, analysis and error remediation Process.
  • Documented the workflow dependencies and ordering for Autosys job scheduling process.
  • Used Autosys Job Scheduling software for batch Scheduling.
  • Interacted with the FA/SME to gather requirements, plan, build, test, deploy, document and apply business rules.
  • Assisted data modeler in creating physical and logical models.
  • Extensively used Source Qualifier, Aggregator, Filter, Joiner, Expression, Lookup, Router, Sequence Generator, Update Strategy, Union and SQL transformations to create mappings.
  • Finalized the loading procedures and defined dependencies among workflows and sessions.
  • Worked with FAs/SMEs on analyzing various legacy data issues, capturing the error data and the impact of error data on business.
  • Worked with the Oracle database Administrator for creating Reference/Mapping tables and loading reference/dummy data, adding/dropping constraints, indexes and columns.
  • Prepared unit test plans and tested against them.
  • Worked with the Testers in writing Test cases and Validated Test Cases prepared by them.
  • Migrated Informatica code from Dev to QA and worked with QA team in fixing the data/code related bugs.
  • Documented the complete ETL Process along with the Source-Target Matrix documentation and Performance Issues.
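The error-handling pattern above, capturing rows that fail basic null checks into PMERR-style error tables for analysis and remediation, can be sketched generically; the field names and rules below are illustrative, not the actual PMERR schema:

```python
from typing import Iterable

REQUIRED_FIELDS = ("member_id", "effective_date")  # illustrative required keys

def split_error_rows(rows: Iterable[dict]) -> tuple[list[dict], list[dict]]:
    """Route rows with missing required fields to an error list (the
    mapplet's error-table target), tagged with a reason for remediation;
    pass the remaining rows through to the normal load path."""
    good, errors = [], []
    for row in rows:
        missing = [f for f in REQUIRED_FIELDS if row.get(f) in (None, "")]
        if missing:
            errors.append({**row, "error_reason": "missing: " + ", ".join(missing)})
        else:
            good.append(row)
    return good, errors
```

Persisting the tagged error rows supports the downstream impact analysis and remediation work described above.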

Environment: Informatica 8.6.1(Power Center), Repository Manager, Oracle 11g, MS-WORD, Excel, Visio, Toad 10.5, SQL Developer 2.1.1, WinSCP 4.2.8, Oracle GoldenGate 11g, SQL*Plus, Autosys.

Confidential

Systems Analyst/Informatica Developer

Responsibilities:

  • Responsible for maintaining and updating technical documentation for software and systems.
  • Assisted development and production teams by determining Application design flaws, thereby optimizing software development life cycle.
  • Liaised with clients and vendors to resolve day-to-day problems and make process changes as requested by the business users.
  • Participated in design analysis and pre- and post-installation reviews.
  • Responsibilities also included identifying areas of process improvement and establishing effective and efficient processes.
  • Involved in data Extraction and Transformation from various source databases and flat files and Loaded into Data Warehouse using Informatica.
  • Created Source and Target definitions, Transformation, Mapplets and Mapping by using Designer.
  • Implemented various transformations such as Expression, Lookup, Sequence Generator, Source Qualifier, Rank and Aggregator to load data from the Staging area to the Data Warehouse.
  • Involved in creating MultiLoad, FastLoad and TPump control scripts to load data into Teradata staging database.
  • Used Server Manager to create, run and monitor Sessions.
  • Involved in debugging mappings to fix data transformation problems, also involved in tuning Sessions to improve performance.

Environment: Informatica 7.1.4/8.1.1 (PowerCenter/PowerMart), Teradata, SQL Server 2000, Toad, XML, Oracle 9i/10g, HP-UX, MS-Word, Excel.
