
Architect Resume


Plano, TX

  • 9 years of experience in Data Warehousing application development and architecture using IBM Information Server, DataStage, and the Teradata data warehouse. Able to communicate effectively with both technical and business staff, with a clear understanding of the Software Development Life Cycle.
  • Strong expertise in Data Warehousing and Business Intelligence tools: Ascential DataStage v8 (Parallel Extender, Designer, Director, Manager, Administrator, DS Jobs); IBM Information Server (QualityStage, Information Analyzer, MetaStage); Teradata utilities (BTEQ, FastLoad, TPump, FastExport, Queryman). Other: scripting, COBOL, Business Data Objects.
  • As an analyst, experienced in all phases of the Software Development Life Cycle: requirements gathering, analysis, design, construction, testing, deployment, maintenance, and documentation.
  • Expertise in building Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models and Star and Snowflake schema design.
  • Excelled in the following Business Intelligence techniques: data modeling and mapping; efficient, defect-free ETL batch development; Master Data Management and metadata management; source-to-target mapping, data validation, and data quality; advanced analytics, dashboards, scorecards, and standardized/ad-hoc reporting.
  • Strong experience in UNIX environments, shell scripting, and ETL scripting in the context of application architecture and middleware, as well as Windows batch scripts and XML. Databases: Teradata 12.0, DB2, Oracle 9i/10g.
  • Proven track record across DataStage projects: ETL development, application support and maintenance, enterprise data conversion and migration, and performance optimization and tuning.
  • Worked in Incident and Problem Management; well versed in best practices (ITIL Foundation, ATC testing processes, etc.) and in service management tools such as HP Service Desk and AR Remedy.
  • Excelled in the quality control processes followed in all project phases, including internal audits, checklists, defect and causal analysis, unit metrics, and SLAs.
  • Proactive in completing tasks and communicating on time; excellent interpersonal skills and a good team player.

Bachelor of Technology
Major: Electronics and Communication Engineering

BIPM – Metadata Management on BI Decision Systems Framework
BIPM – Master Data Management Certified

Technical Skills

Data warehousing Tools:

IBM DataStage Parallel Extender v7.5.1, 8.0, 8.5; DataStage Designer, Director, Manager; QualityStage; IBM Information Analyzer

Teradata Utilities – BTEQ, FastLoad, TPump, FastExport, Queryman

Databases and Tools:

Oracle 9i / 10g, PL/SQL, DB2, TOAD, Teradata 12.0

Automation systems – Control-M, Autosys, Netezza, ESP scheduler.

BI Concepts and Functions:

DW Architecture, ETL Design, Data Mapping, Data Quality Analysis, System Performance tuning, Operational Data Sources, Decision Support Systems, Dashboards, Ad-hoc Reporting.

Software Languages:

UNIX Shell Scripting, Visual Basic 6.0, COBOL, C

Service Management Tools:

HP Service Desk – Incident, Problem, Change, Configuration Management. BMC AR Remedy for Incident, Problem, Knowledge Management.

Operating Systems:

Windows XP, UNIX, AIX systems, IBM z/OS mainframes, Tandem NonStop.

Office Software:

Microsoft Office Suite 2007, MS Project, MS Visio


Confidential, Plano, TX Feb 2010 - Present
Enterprise Data Warehouse – ETL Lead/Architect

Confidential is one of America's leading retailers, operating throughout the United States and Puerto Rico. JC Penney has the largest apparel and home furnishing sites on the internet, jcp.com, and the nation's largest general merchandise catalog business.


  • Provide architecture solutions and design for the existing ETL applications based on IBM DataStage v8.5 and the Teradata data warehouse environment.
  • Work in liaison with business teams and the various other applications that interface with the Enterprise Data Warehouse to provide and implement ETL and database solutions.
  • Publish the "Systems Appreciation Document" for the multidimensional Enterprise Data Warehouse application and its architecture, which feeds the Decision Support Systems and other applications: Operational Data Store, Data Marts, Info-vision.
  • Ensure quality, timely customer service to clients; fix their issues and answer queries promptly.
  • Maintain prompt, high-quality communication to the various stakeholders in case of system failures, incident reports, enhancements, etc.
  • Design and develop fixes for broken systems and to support business process changes in the organization.
  • Develop strategies for implementing data validations in ETL processing.
  • Carry out development and testing of DataStage and Teradata jobs and deploy them to production.
  • Work in an onsite-offshore model of application support and take rotations in being on-call.
  • Work in liaison with system support teams to resolve production issues such as database tablespace shortages.
  • Enforce SOX compliance and corporate standards in the ETL applications; worked on the application team converting application standards to be SOX compliant.
  • Develop enhancements requested by business clients in DataStage in the Operational Data Store application to accommodate the changing needs of the business.
  • Mentor and help junior and peer staff in the team to achieve the common team goal.
  • Use Information Analyzer and QualityStage to monitor production jobs and carry out performance analysis.
  • Perform analysis along the lines of infrastructure rationalization, data quality assessments, and data quality monitoring.
  • Maintain the repository of source system mappings for the master data around business dimensions including Customer, Vendor, Item, etc.; able to provide faster mapping solutions for proactive proposals.
  • Report any high and critical incidents to management and business partners, and maintain communication to management, business clients, and all stakeholders.
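The data-validation strategies mentioned above typically reconcile what was extracted against what was loaded. As a minimal sketch (not taken from the project's actual codebase; table, key, and column names are hypothetical, and the real checks would run in Teradata SQL rather than Python), a batch-end reconciliation might look like:

```python
# Hypothetical source-to-target reconciliation check run after a batch load.
# Compares row counts, key coverage, and NULL rates on required columns.

def reconcile(source_rows, target_rows, key, not_null_cols):
    """Compare a source extract against the loaded target and report issues."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: source={len(source_rows)} "
                      f"target={len(target_rows)}")
    source_keys = {r[key] for r in source_rows}
    target_keys = {r[key] for r in target_rows}
    missing = source_keys - target_keys
    if missing:
        issues.append(f"{len(missing)} source keys missing in target")
    for col in not_null_cols:
        nulls = sum(1 for r in target_rows if r.get(col) is None)
        if nulls:
            issues.append(f"column {col}: {nulls} NULLs in target")
    return issues

# Hypothetical sample batch: two item records, one losing its price on load.
src = [{"item_id": 1, "price": 9.99}, {"item_id": 2, "price": 4.50}]
tgt = [{"item_id": 1, "price": 9.99}, {"item_id": 2, "price": None}]
print(reconcile(src, tgt, "item_id", ["price"]))
# → ['column price: 1 NULLs in target']
```

In a real ETL batch, each reported issue would typically fail the job or raise an incident rather than just print.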

Specialized Framework used in the project: Extensively worked on BIPM – Master Data Management Framework from BIDS (BI Decision Support)

Environment: IBM DataStage – Director, Manager, Information Analyzer; UNIX; Teradata 12.0; BMC Remedy – Incident Management Console, Problem Management Console, Remedy Knowledge Management; PVCS Dimensions for change control; IBM DB2; mainframes; MS Visio; MS Office

Confidential, Minneapolis, MN June 2007 – Jan 2010
Target Inforetriever – ETL Developer / Data Quality Analyst

Confidential is a U.S.-based company headquartered in Minneapolis, Minnesota; the first Target store opened in 1962. Target has developed several reporting systems used by top management as well as by business clients. I was involved in the Marketing and Business Intelligence wing – Info Retriever, a data warehousing application that provides multidimensional reporting on sales, inventory, forecasts, and vendor/merchant analysis.

Responsibilities – ETL Architect

  • Provide consultation to the various stakeholders, including the business users of the application and users of interfacing applications, in regard to their requests.
  • Analyze, study, and lead application and data integration requirements to proactively design, plan, and deploy needed capabilities.
  • Manage incoming work requests, convert them into billable projects, and provide the architecture of the solution to be developed.
  • Meet with business clients to understand their needs and changes in their business routines.
  • Prepare the Business Requirements Document with in-scope and out-of-scope specifications and coordinate with the business clients for their sign-off.
  • Communicate the requirements clearly to the offshore team and work with them on effort estimates, target delivery dates, and work breakdowns.
  • Prepare the high-level and technical design documents covering DataStage jobs, mappings, database schema, multidimensional mapping, and the traceability matrix.
  • Work with the offshore and onsite teams on development of the designed components; review and resolve conflicts and issues raised during the coding phase.
  • Work in partnership with system administrators and engineering groups to assist in design changes to equipment; determine the impact on software deployments, operations, and supportability to ensure timely, uninterrupted service for the customer.
  • Developed BTEQ SQL scripts for loading/unloading data, applying transformations per business rules.
  • Implemented SCD Type 2 effective-date-range history tracking, loading data with FastLoad and TPump.
  • Used several DataStage stages: Filter, Join, Transformer, Sort, etc.
  • Prepare and publish status reports, installation plans, communication reports, etc., to the above stakeholders.
  • Maintain sourcing and version control of the ETL jobs in PVCS Dimensions.
  • Participate in hardware and software configuration management.
  • Undertake projects such as the Solaris upgrade and SAP load-module conversion.
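The SCD Type 2 effective-date-range pattern noted above keeps full history in a dimension by closing out the old version of a row and inserting a new current one. A minimal sketch of that logic follows; the resume implements it in Teradata with FastLoad/TPump, so Python is used here only to illustrate the pattern, and all field names (`eff_start`, `eff_end`, `cust_id`, `city`) are hypothetical.

```python
# Illustrative SCD Type 2 logic: expire the changed current row by setting its
# effective-end date, then insert a new version open-ended to a high date.
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "still current" end date

def apply_scd2(dim_rows, incoming, key, attrs, load_date):
    """Expire changed current rows and append new current versions."""
    current = {r[key]: r for r in dim_rows if r["eff_end"] == HIGH_DATE}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None or any(cur[a] != rec[a] for a in attrs):
            if cur is not None:
                cur["eff_end"] = load_date  # close out the old version
            dim_rows.append({key: rec[key], **{a: rec[a] for a in attrs},
                             "eff_start": load_date, "eff_end": HIGH_DATE})
    return dim_rows

# Hypothetical example: a customer moves, so the old row is expired and a
# new current row is inserted.
dim = [{"cust_id": 1, "city": "Plano",
        "eff_start": date(2009, 1, 1), "eff_end": HIGH_DATE}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Dallas"}],
                 "cust_id", ["city"], date(2010, 2, 1))
```

After the call, `dim` holds two versions of customer 1: the Plano row ending 2010-02-01 and a Dallas row open to the high date, so point-in-time queries can filter on the date range.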


  • Improved performance of the reporting application by fine-tuning SQL queries, tuning metadata, and improving index visibility for the database tables, which noticeably improved the user experience of the application.
  • Improved the performance of the ETL jobs by reducing job contentions, table locks, data-loader overhead, etc.

Responsibilities – Data Analyst

  • As Data Analyst, responsible for monitoring and assessing the data consistency, integrity, and quality of the ETL applications.
  • Part of the SOX compliance team, with the objective of ensuring that the client's ETL applications are SOX compliant.
  • Work with the application team to make the necessary changes to the existing DataStage applications to ensure SOX compliance.
  • As part of data quality assessment, used IBM Information Analyzer to inventory the data, identify integration points, remove data redundancies, and document disparities between applications.
  • Ensure all transactions in the ETL applications are processed quickly and uniformly so that the application teams are able to track and respond to risk resulting from the business, vendors, and clients.
  • As part of data quality monitoring, used IBM Information Analyzer to better understand the business context of the legacy data in the company's existing applications.
  • Worked in a team to analyze this information and proposed data quality solutions to cleanse the customer data and spot trends over time, increasing confidence in the data.
  • Performed application decommission analysis and proposals: technology sunset, identifying key business processes, and suggesting value additions to the new proposal.
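Column profiling is the starting point for the data quality assessment work described above: for each column, measure completeness, cardinality, and the most frequent values. Tools such as IBM Information Analyzer automate this; the sketch below only illustrates the idea, and the sample data and column names are hypothetical.

```python
# Illustrative column-profiling routine of the kind a data quality
# assessment automates: completeness, distinct count, and top values.
from collections import Counter

def profile_column(rows, col):
    """Return basic quality metrics for one column of a row set."""
    values = [r.get(col) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "completeness": len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

# Hypothetical sample: a state column with one missing value.
rows = [{"state": "TX"}, {"state": "TX"}, {"state": "MN"}, {"state": None}]
profile = profile_column(rows, "state")
print(profile)
# → {'completeness': 0.75, 'distinct': 2, 'top_values': [('TX', 2), ('MN', 1)]}
```

Comparing such profiles over successive loads is one simple way to spot trends and drift in the data, as the monitoring bullet above describes.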

Environment: IBM Information Server v8.0 including DataStage, QualityStage, Information Analyzer; UNIX and shell scripting; Teradata – BTEQ, FastLoad, TPump; Attunity ODBC driver; VB 6.0; COBOL; MS Office; MS Visio; HP Service Desk; MS Project; IBM mainframes.


  • Received “Great Team Card” in recognition of my contribution towards the consultation given for the development of an automated reporting project.
  • Was part of the team which received “Best Team Ever” award for the development of the “Gift Registry – New Reporting Strategy” Project.

Confidential, Chennai, India July 2005 – June 2007

Established in 1968, Tata Consultancy Services has grown to its current position as the largest IT services firm in Asia based on its record of outstanding service, collaborative partnerships, innovation and corporate responsibility. Values of TCS are integrity, leading change, excellence, respect for the individual and fostering an environment of learning and sharing.

Analytical Data Warehouse Development May 2006 - June 2007
Senior Developer


  • Was part of a high-profile team for a health insurance partner.
  • Worked on data modeling for dimensional and multidimensional reporting.
  • Made major contributions to the data warehouse design in IBM DB2 and to the Extract, Transform and Load (ETL) processing through Ascential DataStage.
  • Performed cost and effort analysis and prepared estimates for the requirements with a high degree of accuracy.
  • Worked with several source teams to understand the format of the data and its transformation into application-compatible data.
  • Designed and developed all the ETL jobs in the application using the Ascential DataStage Designer module.
  • Gained exposure to BI concepts: Master Data Management, metadata modeling.
  • Extensively used the Designer module for the development of the project.
  • Formulated the data retrieval queries efficiently to improve application performance.
  • Designed and developed the application architecture on UNIX with the database on IBM DB2.
  • Used DataStage Director for running and monitoring the batch jobs developed and for debugging the code.

Environment: Ascential Datastage Designer, Datastage Director, SQL Server 2005, UNIX environment, IBM Mainframes


  • Received the “Star of the Month” Award for December 2007 at TCS Chennai.
  • Nominated for the TCS Gems for the work done towards the implementation of the project.

Scale Management Technology Development July 2004 – Apr 2006
Team Member / Developer
This project concerns the scale labels printed on pharmaceutical products sold in stores. Data provided by specialists was collected and stored in the database supplied by ADC Systems. The ETL jobs, designed in Cognos, transform the raw data into the ADC-compatible format.


  • Involved in the transformation part of the batch processing in the project.
  • Worked with ADC vendors and the source teams to visualize the flow of data and understand the conversion methodologies.
  • Estimated the cost and effort of all phases of development, including maintenance.
  • Involved in database design and development and the SOA architecture.
  • Designed and developed the ETL batch jobs in Cognos using the Designer and Director modules.
  • Conducted extensive tests on the batch jobs and created the XML outputs fed into the ADC systems for loading the data into the tables.
  • Offered knowledge-sharing sessions to fellow team members.

Environment: Cognos ETL tools, reporting, SOA, Oracle DI, HP Quality Center, BMC Remedy – Incident Management Console, Problem Management Console, Remedy Knowledge Management, PVCS Dimensions for change control, IBM DB2, mainframes, MS Visio, MS Office
