Technical Architect Resume Profile

Professional Summary

  • 10 years of IT experience in the analysis, design, development, and implementation of enterprise data warehousing and data/dimensional modeling, with working knowledge of Big Data.
  • IBM Certified Solution Developer - InfoSphere DataStage v8.5.
  • More than 6 years of experience with the onsite/offshore delivery model.
  • Broad experience implementing data warehousing solutions for large corporate clients.
  • Strong understanding of the principles of enterprise data warehousing and dimensional modeling.
  • Strong experience working with business analysts to gather requirements from business owners and transform them into effective technology solutions.
  • Worked on end-to-end design of data warehousing solutions, including creating mapping documents using Information Analyzer, performing data modeling using ERwin, and building audit, error-handling, and error-reprocessing models.
  • Extensively worked with DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Experience writing server routines and using advanced stages such as XML Pack, Salesforce, Java Transformer, MQ, and WSDL.
  • Good knowledge of schedulers such as Zena and Control-M, SVN version control, and IBM Information Server Manager for job migration between environments.
  • Experience creating shell scripts for automation and auditing (see the sketch after this list).
  • Good understanding of the IBM Master Data Management data model; worked as technical lead integrating multiple source systems into MDM and extracting MDM data to build EDW/EDM Party, Account, and Party-to-Account relationship dimensions.
  • Performed multiple Big Data proofs of concept using HDFS, MapReduce, Hive, Pig, and HBase.
  • Experience in application development using Java and J2EE.
  • Excellent interpersonal and communication skills; proven team player with an analytical bent for problem solving and delivering in high-stress, resource-constrained environments.
  • Strong functional abilities; flexible and goal-oriented, with good communication and interpersonal skills, self-motivation, and quick learning on the job.
  • Worked on projects as a team member, technical lead, and architect, with hands-on experience across all phases of the software development life cycle.
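
A minimal sketch of the kind of automation/audit shell script referenced above, assuming a DB2 command-line environment; the load-step path, database, and etl_audit table are hypothetical:

    #!/bin/sh
    # Hypothetical audit wrapper: run one load step and record its row
    # count and exit status in an audit table. All names are illustrative.
    JOB_NAME="$1"                       # e.g. load_party_dim
    SRC_FILE="$2"                       # landing file for this run
    RUN_TS=$(date '+%Y-%m-%d %H:%M:%S')
    SRC_ROWS=$(wc -l < "$SRC_FILE")     # rows received from source

    sh "/etl/bin/run_${JOB_NAME}.sh"    # placeholder for the real load step
    STATUS=$?

    db2 connect to EDWDB               # assumed audit database
    db2 "INSERT INTO etl_audit (job_name, run_ts, src_rows, status)
         VALUES ('$JOB_NAME', '$RUN_TS', $SRC_ROWS, $STATUS)"
    db2 connect reset

    exit $STATUS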

Areas/Applications

Operating System

Linux, Sun Solaris 8/7.0, IBM AIX 4.3/5.1

Tools

DataStage 8.7/8.5, IBM Master Data Management, IBM CDD, IBM Data Studio, Information Analyzer, Control-M, Zena scheduler, SVN, ERwin

Databases

Oracle 9i/10g/11g, DB2, and SQL Server.

Languages

SQL, PL/SQL, Unix Shell Scripting, Java, BASIC

Professional Experience

Project: One Customer - MDM Integration

Title

Confidential

Client Name

Confidential

Work Location

Confidential

Period

Confidential

Position

Technical Architect

Brief Description

  • As part of One Customer, HNB is implementing MDM (Master Data Management) around customer and account data. MDM creates a golden copy, or master record, of party/account information coming from different source systems. The main goal of this project is to integrate data from 27 source systems into MDM.

  • HNB wants to implement EIM (Enterprise Information Management), which involves meeting Huntington's demands for information using the current environment and modernizing the information management platform via a strategic initiative called OnePoint/One Customer.

  • Use IBM DataStage to populate MDM with customer and account information via XML.

  • Maintain Party and Account dimension models with SCD-2 and SCD-1 operations (see the SCD-2 sketch after this list).

  • Perform name and address standardization using QualityStage.

  • Compose XMLs based on the MDM XSD for addParty, maintainParty, addAccount, and maintainAccount using DataStage XML stages.

  • Run the MDM batch process to invoke MDM services, which take XML requests and create XML responses with success or error messages.

  • Reprocess MDM rejects based on the response error code.
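
To make the SCD-2 maintenance above concrete, a minimal sketch using the DB2 command line; the database, table, and column names are hypothetical, and a real implementation would run this logic inside DataStage jobs:

    #!/bin/sh
    # Hypothetical SCD-2 step for the party dimension: close out the
    # current row, then insert the new version as the current row.
    db2 connect to EDWDB

    # Expire the existing current record for this party.
    db2 "UPDATE party_dim
         SET eff_end_dt = CURRENT DATE, curr_flag = 'N'
         WHERE party_id = 123 AND curr_flag = 'Y'"

    # Insert the new version with an open-ended effective date.
    db2 "INSERT INTO party_dim
           (party_id, party_name, eff_start_dt, eff_end_dt, curr_flag)
         VALUES (123, 'NEW NAME', CURRENT DATE, '9999-12-31', 'Y')"

    db2 connect reset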

Responsibilities

  • Create and maintain a multi-year vision to drive the maturity of enterprise data assets.

  • Build and maintain the Reference Architectures / Target State Architectures and ensure linkages to other architecture models.

  • Understand, analyze, and see through the requirements, and propose changes that are efficient, performance-oriented, and technologically feasible.

  • Design the implementation solution for requirements per the proposed changes and guide a team of junior associates through it.

  • Prepare technical design documents and perform data and dimensional modeling using ERwin.

  • Manage and drive large cross-departmental efforts and delivery teams involving ETL, MDM, data warehouse, Java, and SOA developers.

  • Implemented the audit, error-handling, and error-reprocessing process.

  • Developed complex jobs, common routines, shell scripts, and patterns needed for development.

  • Tune the code with performance and consistency with the requirements as the main considerations.

  • Coordinated with the customer to understand new requirements and changes and deliver them.

  • Applied several process improvements to standardize work across SDLC phases, such as the code review document, technical design review document, data validation sheet, and UTD.

  • Organize and conduct brainstorming sessions among team members to expose them to the latest technology trends and identify their applicability within the project.

Environment

DataStage v8.7, IBM Data Studio, SQL Studio, WinSCP, PuTTY, Zena, IBM mainframe DB2 and DB2 LUW, Oracle, SQL Server.

Title

Confidential

Client Name

Confidential

Work Location

Confidential

Period

Confidential

Position

Technical Lead

Brief Description

  • HNB wants to implement EIM (Enterprise Information Management), which involves meeting Huntington's demands for information using the current environment and modernizing the information management platform via a strategic initiative called OnePoint. This strategic initiative has three main objectives:

  • Create a common data repository called the EDW (Enterprise Data Warehouse).

  • Data Governance/Data Quality: implement MDM (Master Data Management) around customer and account data, and implement an active data quality and remediation program.

  • Information Delivery: provide a self-service reporting and analytical environment.

  • This project involves ETL using IBM DataStage v8.7 to extract data from systems such as loans, withdrawal, IE deposit, and demand draft; transform it per the EDW data model; and load it into MDM and the EDW/EDM DB2 database.

Responsibilities

  • Understand, analyze, and see through the requirements, and propose changes that are efficient, performance-oriented, and technologically feasible.

  • Design the implementation solution for requirements per the proposed changes and guide a team of junior associates through it.

  • Prepare technical design documents and perform data and dimensional modeling using ERwin.

  • Implemented the audit, error-handling, and error-reprocessing process.

  • Worked as tech lead, leading an 8-member team.

  • Developed complex jobs, common routines, shell scripts, and patterns needed for development.

  • Tune the code with performance and consistency with the requirements as the main considerations.

  • Coordinated with the customer to understand new requirements and changes and deliver them.

  • Applied several process improvements to standardize work across SDLC phases, such as the code review document, technical design review document, data validation sheet, and UTD.

  • Organize and conduct brainstorming sessions among team members to expose them to the latest technology trends and identify their applicability within the project.

  • Help the testing teams understand the requirements; supervise error fixes and provide corrected code for retest when errors are reported.

Environment

DataStage v8.7, IBM Data Studio, SQL Studio, WinSCP, PuTTY, Zena, IBM mainframe DB2 and DB2 LUW, Oracle, SQL Server.

Title

Confidential

Client Name

Confidential

Work Location

Confidential

Period

Confidential

Position

System Analyst

The Project

  • One of the key objectives of this POC is to effectively leverage customer-based analytics to drive and influence decisions and customer experience. In short, the Customer Consolidation Platform (CCP) will be a platform of customer intelligence that enables the analytical and operational demands of the next-generation customer. CCP will identify customers uniquely across all channels and capture analytical information on their transactions and interactions.

  • This project uses Hadoop (Pig, Hive, and MapReduce) to extract data from different systems, transform it per the model, and load it into HBase tables that serve the analytical reporting needs of Marketing, Sales, CSD, and ad hoc reporting.

Carried out the following activities:

Collected transaction and profile data from different systems and prepared it for loading into Hadoop in the required model.

Used Sqoop to extract data from source systems and load results back into the analytics data model; non-RDBMS data was landed directly in HDFS.
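
A minimal Sqoop import of the kind described; the JDBC URL, credentials, table, and target directory are placeholders:

    # Pull one source table into HDFS with four parallel mappers.
    # -P prompts for the password instead of embedding it in the script.
    sqoop import \
      --connect jdbc:db2://dbhost:50000/SRCDB \
      --username etl_user -P \
      --table CUSTOMER_PROFILE \
      --target-dir /data/landing/customer_profile \
      --num-mappers 4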

Transformed data to load into the Hive model, performing joins and aggregations in Hive between profile data and transaction data.
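
A sketch of the kind of Hive join and aggregation described, run through the Hive CLI; table and column names are hypothetical:

    # Join profile and transaction data and aggregate per customer.
    hive -e "
    INSERT OVERWRITE TABLE cust_activity
    SELECT p.cust_id,
           p.segment,
           COUNT(*)      AS txn_count,
           SUM(t.amount) AS txn_amount
    FROM profile_data p
    JOIN txn_data t ON p.cust_id = t.cust_id
    GROUP BY p.cust_id, p.segment;"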

Finally, loaded the data into HBase staging tables, using Sqoop to move data from the Hive tables into the HBase staging tables.
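
The resume cites Sqoop for this step; another common pattern maps a Hive table directly onto HBase via the Hive HBase storage handler, sketched here with illustrative names:

    # Create an HBase-backed Hive table, then insert the transformed
    # rows into it; this populates the HBase staging table directly.
    hive -e "
    CREATE TABLE cust_activity_hbase (cust_id STRING, txn_count BIGINT, txn_amount DOUBLE)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,a:txn_count,a:txn_amount')
    TBLPROPERTIES ('hbase.table.name' = 'CUST_ACTIVITY_STG');

    INSERT OVERWRITE TABLE cust_activity_hbase
    SELECT cust_id, txn_count, txn_amount FROM cust_activity;"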

Hardware

8-node cluster.

Operating System

Linux

Software

Hadoop distribution, Pig, Hive, Sqoop, and HBase.

Database

IBM Mainframe DB2 and DB2 LUW.

Client Name

Confidential

Work Location

Confidential

Period

Confidential

Position

Sr ETL Developer

The Project

  • MTS is a large telecommunications carrier providing long-distance services and broadband data globally. This project involves developing a data warehouse from different data feeds and other operational data sources.

  • Built a central data warehouse fed from sources such as Oracle, SQL Server, and flat files. Actively involved as an analyst in preparing design documents, and interacted with the data modelers to understand the data model and design the ETL logic.

Responsibilities

  • Worked with business analysts to identify and develop business requirements, transformed them into technical requirements, and was responsible for deliverables.

  • Provided staging solutions for data validation and cleansing with DataStage ETL jobs.

  • Performed data conversion from DB2, Oracle, and flat files.

  • Used DataStage Designer to develop processes for extracting, transforming, integrating, and loading data into the enterprise data warehouse.

  • Used Parallel Extender for parallel processing to improve performance when extracting data from the sources.

  • Used various Parallel Extender partitioning and collecting methods.

  • Extensively worked with job sequences using Job Activity, Email Notification, Sequencer, and Wait For File activities to control and execute DataStage parallel jobs.

  • Created reusable components using parallel shared containers.

  • Defined stage variables for data validation and data filtering.

  • Tuned DataStage jobs for better performance by creating hashed files for staging data and lookups. Used DataStage Director to run the jobs.

  • Wrote DataStage routines to implement business logic.

  • Wrote shell scripts extensively for different scenarios.

  • Implemented debugging methodologies with breakpoint options.

  • Designed and implemented slowly changing dimension methodologies.

  • Wrote DataStage routines for data validation.

  • Wrote batch job controls to automate execution of DataStage jobs (see the sketch after this list).

  • Also involved in preparing work plans and assigning ETL job development to team members. Responsible for peer testing before delivery of the ETL objects.

  • Worked with Cognos report developers to gather reporting requirements and populated the data accordingly, minimizing the load and complexity of fetching reports.

  • Actively worked nights and weekends during go-live and post-production.
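
A minimal sketch of the batch job control mentioned above, wrapping the DataStage dsjob command line; the project and job names are placeholders:

    #!/bin/sh
    # Run a DataStage job and wait for it; with -jobstatus, dsjob's exit
    # code is derived from the job's finishing status.
    PROJECT="EDW_PROJ"      # placeholder project
    JOB="seq_load_edw"      # placeholder job or sequence

    dsjob -run -jobstatus "$PROJECT" "$JOB"
    RC=$?

    # Treat any non-zero code as failure (a simplification: warning
    # statuses also come back non-zero).
    if [ $RC -ne 0 ]; then
        echo "Job $JOB failed with status $RC" >&2
        exit $RC
    fi
    echo "Job $JOB finished OK"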

Environment

DataStage v8.7, IBM Data Studio, SQL Studio, WinSCP, PuTTY, Zena, IBM mainframe DB2 and DB2 LUW, Oracle, SQL Server.
