
Analyst Programmer Resume


SUMMARY

  • 8+ years of experience with IBM IIS DataStage 11.3, 9.1 and 8.5, DataStage 8.1 and Ascential DataStage 7.5 EE (PX) in various roles and responsibilities.
  • Excellent experience with IBM Information Analyzer.
  • Good work experience in UNIX shell scripting.
  • Good work experience in Teradata, DB2, Oracle and SQL.
  • Good knowledge of data warehousing concepts.
  • Work experience with the Control-M scheduling tool, MKS and RTC.
  • Excellent interpersonal, communication, technical and management skills; strong logical and analytical skills; a quick learner.
  • Expertise in extracting, transforming and loading data from operational sources such as Oracle and flat files into a data warehouse.
  • Knowledge of Tableau.

TECHNICAL SKILLS

ETL: IBM IIS DataStage 11.3, 9.1 and 8.5, WebSphere DataStage 8.1, DataStage Version Control, DataStage SE 7.5

Languages: SQL, PL/SQL Programming, HTML and UNIX Shell Scripting

Databases: Teradata, DB2, Oracle and SQL

Operating System: Unix, Linux

Scheduling Tools: Control-M

Other Tools: Information Analyzer, MKS and RTC

Currently Learning: Tableau

PROFESSIONAL EXPERIENCE

Confidential

Analyst Programmer

Environment: DataStage 9.1, Teradata, UNIX Shell Scripting & Information Analyzer.

Responsibilities:

  • Analyzed the data through Information Analyzer; extracted data from flat files into staging tables, applied the business logic in the transformation layer and loaded the results into the target Teradata tables (illustrated in the sketch after this list).
  • Developed and tested the data loads, and provided knowledge transfer (KT) to the UAT team on the business requirements.
  • Designed parallel jobs using Sort, Join, Lookup, Funnel, Transformer and Filter processing stages.
  • Created monthly data load extracts from all SBA loan servicing agencies for the Data Analytics team; the sources were several thousand CSV, Access, SQL and flat files.
  • Wrote extensive backend SQL scripts to scrub and load data files into a DB2 database for each month's analytics reporting services.
  • Designed ETL technical specifications and performed analysis.
  • Conducted live sessions to demonstrate prototypes, run UAT labs and resolve issues.
  • Involved in defect analysis.
  • Prepared the test case document and wrote test cases in Quality Center (QC).
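
The flat file to staging to target Teradata flow above is often wrapped in a small load script. The following is a minimal sketch only, assuming a pipe-delimited extract, a BTEQ client on the ETL server, and hypothetical object names and paths (STG_DB.STG_LOAN, TGT_DB.LOAN, /data/inbound); the real loads were built as DataStage jobs, so the script is purely illustrative.

#!/bin/sh
# Minimal sketch: land a pipe-delimited extract into a Teradata staging table,
# then apply simple business logic while loading the target table.
# All object names, paths and the logon file are hypothetical placeholders.

SRC_FILE=/data/inbound/loan_extract.dat     # hypothetical landing path
LOGON_FILE=$HOME/.td_logon                  # contains ".LOGON tdpid/user,password"

bteq <<EOF
.RUN FILE=${LOGON_FILE}
.SET QUIET ON

/* Land the raw extract into the staging table */
.IMPORT VARTEXT '|' FILE=${SRC_FILE}
.REPEAT *
USING (acct_no VARCHAR(20), loan_amt VARCHAR(20), status_cd VARCHAR(5))
INSERT INTO STG_DB.STG_LOAN (acct_no, loan_amt, status_cd)
VALUES (:acct_no, :loan_amt, :status_cd);

/* Apply the business rule (illustrative filter and cast) and load the target */
INSERT INTO TGT_DB.LOAN (acct_no, loan_amt, status_cd)
SELECT acct_no, CAST(loan_amt AS DECIMAL(15,2)), status_cd
FROM   STG_DB.STG_LOAN
WHERE  status_cd <> 'X';

.QUIT
EOF

rc=$?
[ ${rc} -ne 0 ] && echo "Teradata load failed with return code ${rc}" >&2
exit ${rc}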

Confidential

Designing and Testing

Environment: DataStage 9.1, Teradata, UNIX Shell Scripting & Information Analyzer.

Responsibilities:

  • Produced high-quality work products and deliverables on time (zero roll-backs or backouts), within budget and within the allowable Delivery Excellence metrics, adhering to the defined process.
  • Took on additional team responsibilities and stayed flexible for the benefit of the project.
  • Ensured timely and prompt reporting of risks, issues and dependencies.
  • Developed strategies for the Extraction, Transformation and Loading (ETL) mechanism.
  • Designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Funnel and Transformer.
  • Designed reusable components (parameter sets, job sequences) that can be used across the project.
  • Created UNIX shell scripts for the audit process (see the audit sketch after this list).
  • Worked extensively with DataStage Manager, Designer and Director.
  • Performed performance tuning to improve the performance of the DataStage jobs.
  • Involved in unit testing, system testing and UAT.
  • Created data lineage reports and worked with Metadata Workbench.
  • Involved in deployment of code in UAT.
  • Involved in defect analysis.
  • Analyzed the data through Information Analyzer.
  • Created test case documents.
  • Interacted with the testing team on any corrections of data.
  • Wrote test cases in QC.
  • Provided KT to the team in the required areas.
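
The audit scripts referenced above typically reconcile record counts between the inbound file and the loaded Teradata table. The sketch below is a hypothetical example of that pattern: the file path, table name, logon file and log location are placeholders rather than the project's actual objects, and the BTEQ output parsing is deliberately simple.

#!/bin/sh
# Minimal audit sketch: compare the source file record count with the target
# table row count and fail the batch step on a mismatch.
# Assumes a delimited source file with one header row; all names are placeholders.

SRC_FILE=/data/inbound/customer.dat
TGT_TABLE=EDW_DB.CUSTOMER
LOGON_FILE=$HOME/.td_logon
AUDIT_LOG=/var/log/etl_audit.log

# Source count: total lines minus the header record
src_cnt=$(( $(wc -l < "${SRC_FILE}") - 1 ))

# Target count: run a COUNT(*) through BTEQ and keep the last all-digit line
raw=$(bteq 2>/dev/null <<EOF
.RUN FILE=${LOGON_FILE}
.SET QUIET ON
SELECT COUNT(*) FROM ${TGT_TABLE};
.QUIT
EOF
)
tgt_cnt=$(printf '%s\n' "${raw}" | awk '/^ *[0-9]+ *$/ { n = $1 } END { print n }')

echo "$(date '+%Y-%m-%d %H:%M:%S') ${TGT_TABLE} source=${src_cnt} target=${tgt_cnt}" >> "${AUDIT_LOG}"

if [ -z "${tgt_cnt}" ] || [ "${src_cnt}" -ne "${tgt_cnt}" ]; then
    echo "AUDIT FAILED: count mismatch for ${TGT_TABLE}" >&2
    exit 1
fi
exit 0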

Confidential

Designing, Testing, Production Implementation and Maintenance

Environment: DataStage 8.1, DB2, UNIX Shell Scripting & Control-M.

Responsibilities:

  • Gathered the requirements from the onsite team and analyzed them.
  • Created the mapping sheet and the design document.
  • Interpreted the technical specifications and developed DataStage jobs for the extraction, transformation, cleansing and loading processes of the data warehouse.
  • Designed parallel jobs using stages such as Transformer, Lookup, Join, Filter, Funnel, Copy, Remove Duplicates, Sort and Sequential File.
  • Designed reusable components (parameter sets, job sequences) that can be used across the project.
  • Used DataStage as the ETL tool to extract data from source systems and load it into Teradata, Oracle and DB2 databases.
  • Extensively involved in creating UNIX shell scripts for the audit process.
  • Reviewed the code delivered by team members and raised defects in QC.
  • Designed, compiled, validated, ran, monitored, exported and imported DataStage jobs using the client components.
  • Performed performance tuning to improve the performance of the DataStage jobs.
  • Involved in unit testing, system testing and UAT.
  • Scheduled the jobs using the Control-M tool (a wrapper-script sketch follows this list).
  • Involved in deployment of code to production.
  • Involved in defect analysis.
  • Extensively involved in data validation.
  • Interacted with the source team on any corrections of data in the files.
  • Wrote test cases in QC.
  • Involved in production support.
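
Control-M generally calls a shell wrapper that drives the DataStage engine through the dsjob command-line client. The script below is a minimal sketch of such a wrapper, assuming a default engine install path and hypothetical project, job and parameter names (DWPROJ, jb_load_customer, prm_run_date); it is not the project's actual wrapper.

#!/bin/sh
# Minimal sketch of a Control-M wrapper around dsjob.
# Project, job and parameter names are hypothetical placeholders.

DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
. ${DSHOME}/dsenv                      # source the DataStage engine environment

PROJECT=DWPROJ
JOB=jb_load_customer
RUN_DATE=$(date '+%Y-%m-%d')

# Reset the job first in case a previous run left it in an aborted state
${DSHOME}/bin/dsjob -run -mode RESET -wait "${PROJECT}" "${JOB}" >/dev/null 2>&1

# Run the job and wait; with -jobstatus the exit code reflects the finishing
# status of the job (1 = finished OK, 2 = finished with warnings)
${DSHOME}/bin/dsjob -run \
    -param prm_run_date="${RUN_DATE}" \
    -jobstatus "${PROJECT}" "${JOB}"
rc=$?

if [ ${rc} -eq 1 ] || [ ${rc} -eq 2 ]; then
    echo "${JOB} completed with status ${rc}"
    exit 0                             # Control-M marks the step OK
fi
echo "${JOB} failed with status ${rc}" >&2
exit ${rc}                             # non-zero exit lets Control-M flag the failure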

Confidential

Designing and Testing

Environment: DataStage 8.1, DB2, UNIX Shell Scripting & Control-M.

Responsibilities:

  • Analyzed the RSS document received from the user.
  • Designed and developed the DataStage jobs.
  • Mapped data items from the source system to the target system.
  • Interpreted the technical specifications and developed DataStage jobs for the extraction, transformation, cleansing and loading processes of the data warehouse.
  • Designed reusable components (parameter sets, job sequences) that can be used across the project.
  • Involved in unit testing, system testing and UAT.
  • Extensively involved in creating UNIX shell scripts for the audit process.
  • Designed, compiled, validated, ran, monitored, exported and imported DataStage jobs using the client components.
  • Scheduled the jobs using the Control-M tool.
  • Involved in deployment of code to production.
  • Extensively involved in data validation (see the file-validation sketch after this list).
  • Interacted with the source team on any corrections of data in the files.
  • Involved in writing Test cases in QC.
  • Involved in Production support.
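
Inbound files were validated before loading, and mismatches were raised back to the source team. A common pattern is to check the detail-record count against a trailer record; the sketch below assumes a pipe-delimited feed whose last line is a trailer of the form TRL|<count>. The naming convention and trailer layout are assumptions for illustration, not the actual feed specification.

#!/bin/sh
# Minimal file-validation sketch: compare the detail record count with the
# count declared in the trailer record. Feed layout is an assumed convention.

SRC_FILE=${1:?usage: validate_file.sh <inbound file>}

# Count detail records: every line except the trailer
detail_cnt=$(grep -cv '^TRL|' "${SRC_FILE}")

# Count declared by the trailer record (second pipe-delimited field)
trailer_cnt=$(awk -F'|' '$1 == "TRL" { print $2 }' "${SRC_FILE}" | tail -1)

if [ -z "${trailer_cnt}" ]; then
    echo "VALIDATION FAILED: trailer record missing in ${SRC_FILE}" >&2
    exit 1
fi

if [ "${detail_cnt}" -ne "${trailer_cnt}" ]; then
    # Mismatches are reported back to the source team for file correction
    echo "VALIDATION FAILED: trailer declares ${trailer_cnt}, file has ${detail_cnt}" >&2
    exit 1
fi

echo "${SRC_FILE}: ${detail_cnt} records validated OK"
exit 0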
