
Architect/Technology Lead Resume


SUMMARY

  • 9+ years of IT experience in the Architecture, System Analysis, Design, Development, Implementation, Testing and Production Support of Database and Data Warehousing solutions, covering Data Modeling, Data Extraction, Data Transformation, Data Loading, Data Analysis, Data Integration and ETL activities.
  • Designed technical processes as a Team Lead, using internal modeling and working with analytical teams to gather requirements and create specifications.
  • Managed all aspects of the SDLC process - requirement analysis, time estimates, functional specification, design, development, testing, packaging, and support/maintenance.
  • Expertise in following the Agile process in application development.
  • Strong experience working on ETL strategies using DataStage 11.5/11.3/8.9/8.5/8.1, InfoSphere Information Server, WebSphere, Ascential DataStage and Informatica PowerCenter 9.6/9.1.
  • Expert in designing Parallel jobs using various stages like Join, Merge, Lookup, Remove duplicates, Filter, Dataset, Lookup file set, Complex flat file, ODBC, ABAP Extract, Modify, Aggregator, Transformer, Sort, Copy, Funnel, CDC and Change Apply.
  • Expert in working with Data Stage Manager, Designer, Administrator and Director.
  • Proficient in the analysis, design and development of Database Management Systems using Oracle 11g/10g/9i, MS-SQL, DB2 and Netezza.
  • Good experience in Data Modeling, with expertise in creating Star and Snowflake schemas and FACT and Dimension tables (see the SQL sketch after this list).
  • Extensively worked on Informatica ILM for Data Archival and Masking.
  • Used QlikView 11 for developing reports and dashboards, including KPI reporting, and for building and maintaining QVDs and QVWs.
  • Expertise in creating complex reports, including parameter-based, graphical, formula-based, well-formatted, drill-down, analysis and data reports.
  • Worked with the bug-tracking and project-management tool JIRA.
  • Working experience with the application lifecycle/quality/test management tool HP-ALM.
  • Experienced in working with the scheduling tools Tidal and Control-M.
  • Experienced in Quality Assurance for Data Warehousing projects, creating Test Plans, Test Objectives, Test Strategies and Test Cases, and ensuring the data in the data warehouse meets the business requirements.
  • Proficient in unit testing, system integration testing, and the implementation and maintenance of database jobs.
  • Coordinated and delivered application development related activities.
  • Led a team of 10 data integration members to success in design, development and deployment activities.
  • Strong analytical and problem solving skills with ability to work within team and independently.
  • Experience with advanced data analytics, data transformation and data management projects.
  • Well versed in written & oral communication and analytical & problem solving skills.
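
A minimal SQL sketch of the Star-schema pattern referenced above; the table and column names (fact_sales, dim_customer, dim_date) are hypothetical and stand in for any FACT/Dimension design:

    -- Dimension tables: one denormalized row per business key
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,      -- surrogate key
        customer_id   VARCHAR(20),              -- natural/business key
        customer_name VARCHAR(100),
        region        VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,      -- e.g. 20240131
        calendar_date DATE,
        month_name    VARCHAR(10),
        year_number   INTEGER
    );

    -- Fact table: measures plus foreign keys pointing at each dimension
    CREATE TABLE fact_sales (
        sale_id       INTEGER PRIMARY KEY,
        customer_key  INTEGER REFERENCES dim_customer (customer_key),
        date_key      INTEGER REFERENCES dim_date (date_key),
        quantity      INTEGER,
        sale_amount   DECIMAL(12,2)
    );

A Snowflake variant would further normalize a dimension, for example splitting region out of dim_customer into its own table.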

TECHNICAL SKILLS

Domain: Banking & Finance, Media & Entertainment, Energy, Retail, Manufacturing

ETL Tools: DataStage 11.5/11.3/8.9/8.5/8.1, Informatica PowerCenter 9.6.0/9.0.1

Reporting Tool: QlikView 11

TDM and ILM: Informatica ILM, Informatica TDM, Erwin 8.2 Data Modeler

Scheduler: Tidal Scheduler, Control-M, Autosys

Testing Tools: QTP (HP-ALM), JIRA

Databases: MSSQL, Oracle, DB2, Netezza

Language: SQL, PL/SQL, UNIX Korn Shell Scripting, C, C++

Concepts: Data Modeling, Data Warehouse, ETL, Data Structures, Algorithms, RDBMS

Operating Systems: Linux, Unix, Windows 98/2000/XP/7/8

PROFESSIONAL EXPERIENCE

Confidential

Architect/Technology Lead

Responsibilities:

  • Gathering business-line requirements for the existing code and understanding the current business functionality.
  • Designing the end-to-end ETL architecture flow for the organization.
  • Leading a team of 16 members.
  • Preparing functional, technical and mapping documents.
  • Creating automation jobs, reusable jobs and scripts for faster deliverables.
  • Carrying out deployment activities in the SIT, QA and PROD environments using IBM WebSphere DataStage and IBM InfoSphere Information Server Manager.
  • Preparing test plans and creating test scenarios, test cases, unit test cases and comparison tests.
  • Supporting the QA/UAT team in functional testing.
  • Performing unit, regression and negative testing.
  • Using HP-ALM for quality testing.
  • Performance tuning of DB queries (see the SQL sketch after this list).
  • Coordinating with the offshore team.
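
A hedged sketch of the DB-query tuning step mentioned above, assuming DB2 and hypothetical table/column names (account_txn, posting_date):

    -- Capture the optimizer's access plan for a slow filter query
    -- (DB2 syntax; requires the EXPLAIN tables to exist)
    EXPLAIN PLAN FOR
    SELECT acct_id, balance
    FROM   account_txn
    WHERE  posting_date >= CURRENT DATE - 30 DAYS;

    -- If the plan shows a full table scan, an index on the filter column
    -- (optionally covering the selected columns) usually removes it
    CREATE INDEX ix_account_txn_posting
        ON account_txn (posting_date, acct_id, balance);

    -- Afterwards, refresh statistics from the DB2 CLP, e.g.:
    -- RUNSTATS ON TABLE <schema>.account_txn AND INDEXES ALL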

Technologies, Tools and DB: IBM DataStage 11.5/11.3, DB2, JIRA, Confluence, Control-M, COBOL, Linux

Confidential

Technology Lead

Responsibilities:

  • Involved in requirement clarification and brainstorming with the client business line, Solution Architects and client SMEs for implementing a reporting solution on the Netezza platform.
  • Conducting discussions with Data Architects and the business line to design the DataStage ETL processes that implement the reporting modules.
  • Performing high-level design of ETL Process and documenting them for client review.
  • Coordinating with offshore developers for completing the development tasks.
  • Prepared estimates for the project releases and submitted them for client review.
  • Assisting Project Manager in developing the Project plan and schedule for ETL Area.
  • Assisting the Client SMEs to translate the business requirements into ETL Technical Requirements and documenting them for design.
  • Supporting business line in UAT Testing and clarifications.

Technologies, Tools and DB: IBM DataStage 11.3, Netezza Mako, Oracle 10g, Tidal Scheduler

Confidential

Data Analyst and Technology Lead

Responsibilities:

  • Requirement gathering and analysis with Client.
  • Created archival rules for ex-investor data for each business line.
  • Determined the frequency for executing the data archiving jobs for each business process.
  • Created the data masking strategy for PII and for archived and restored data (see the SQL sketch after this list).
  • Ensured zero impact on the source and target systems after implementation of purging and masking.
  • Implemented the data model using Erwin 8.2.
  • Purged data from the ODS using Informatica ILM and masked data from the ADS using Informatica TDM.
  • Developed SQL queries for the manipulation of data.
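
The masking and purge rules above were built with Informatica TDM/ILM rule definitions; as a plain-SQL illustration of the intent only, using DB2-style syntax and hypothetical table and column names (investor_contact_archive, ssn, email):

    -- Masking analogue: keep the data's shape, replace PII with format-preserving stand-ins
    UPDATE investor_contact_archive
    SET    ssn   = 'XXX-XX-' || SUBSTR(ssn, 8, 4),                            -- keep last 4 digits only
           email = 'masked_' || CAST(contact_id AS VARCHAR(20)) || '@example.com'
    WHERE  archived_flag = 'Y';

    -- Purge analogue: drop ex-investor rows that are past the retention window
    DELETE FROM investor_contact_archive
    WHERE  investor_status = 'EX'
      AND  archive_date < CURRENT DATE - 7 YEARS;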

Technologies, Tools and DB: Informatica ILM, Informatica TDM, Informatica 9.6.0/9.1, Erwin 8.2 Data Modeler, DB2 Data Visualizer

Confidential, Irvine, CA

Data Analyst and SQL Lead

Responsibilities:

  • Worked with customers to gather business requirements for data migration.
  • Worked across multiple functional projects to understand data usage and the implications for data migration.
  • Created test case scenarios and maintained defects in ETL graphs from source to target (see the SQL sketch after this list).
  • Performed tests in the SIT, QA and contingency/backup environments.
  • Performed all aspects of verification and validation, including functional, structural, regression, load and system testing.
  • Experienced in writing test cases, test scripts and test plans, executing test cases, and reporting and documenting the test results.
  • Analyzed the data, applied relevant transformations and verified that the source data types were correct.
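
A typical source-to-target check behind the test cases above, sketched in SQL against MS SQL with hypothetical table names (src_orders, tgt_orders):

    -- Row-count reconciliation between source and target
    SELECT (SELECT COUNT(*) FROM src_orders) AS src_rows,
           (SELECT COUNT(*) FROM tgt_orders) AS tgt_rows;

    -- Completeness check: rows present in the source but missing from the target
    SELECT s.order_id
    FROM   src_orders s
    LEFT JOIN tgt_orders t ON t.order_id = s.order_id
    WHERE  t.order_id IS NULL;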

Technologies, Tools and DB: MS SQL, DataStage 8.5
