
Teradata/ETL/Solutions Architect Resume


Atlanta, GA

SUMMARY:

  • Over 13 years of professional experience in Information Technology with extensive experience in Design, Data Modeling, ETL Architecture, Development, Maintenance and Support, Business Discovery and implementation of Enterprise Data Warehousing (EDW) and Data Integration Projects.
  • Teradata V2R5 Certified Master & IBM Certified Solution Developer - InfoSphere DataStage v8.0
  • Extensively worked in an Agile/Waterfall environment.
  • Experienced in all the phases of the Software Development Life Cycle including Analysis, Coding, Design & Architecting, Testing and implementation with thorough knowledge of Quality Processes used in software development and implementation.
  • Experienced in migrating and rebuilding ETL components from Ab Initio/DB2/DataStage to Informatica/Teradata/WhereScape RED.
  • Successfully delivered multiple Enterprise Data Warehouse projects meeting aggressive development schedules by effectively leading the team and appropriate risk assessment.
  • Experience including analysis, modeling, design, and development of Tableau reports and dashboards for analytics.
  • Extensive experience in Tableau Desktop, Tableau Server and Tableau Reader.
  • Extensively worked with business users/SMEs as well as senior management.
  • Extensively worked on various domains including Healthcare, Insurance, Finance, Banking, Telecommunications and Retail.
  • Expert in Preparing the Design Documents, System Administration Documents, Test Plans & Test Scenarios/Test Cases and Test Results Document.
  • Experienced in Designing and Preparing Training Material and Conducting Training Sessions to the users.
  • Translating business requirements into specific system, application or process designs in the context of highly complex projects involving multiple systems/applications. Applying advanced knowledge of business and IT architecture principles to identify and evaluate alternative solutions.
  • Expert in WhereScape RED ETL tool integration.
  • Expert in both 3NF relational modeling and dimensional modeling, including star-schema and snowflake-schema designs.
  • Ability to multi-task and work in a highly dynamic environment with multiple teams comprised of technical and business resources.
  • Extensive experience in writing functional specifications and translating business requirements into technical specifications; created, maintained and modified database design documents with detailed descriptions of logical entities and physical tables.
  • Experienced in developing Business reports by writing complex SQL queries using views, macros, volatile and global temporary tables.
  • Experienced in forward and reverse engineering processes. Created DDL scripts for implementing data modeling changes, created ERwin reports in HTML or RTF format depending upon the requirement, published data models in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Experienced with Physical Data Modeling, usage of Partitioned Primary Index, choosing Secondary Indices, Join Index, Permanent journaling, Compression Analysis for space recovery, implementing multi value compression.
  • Knowledge of the Teradata industry logical data models (Telecom LDM, FS-LDM, CLDM) and the Banking and Telecom domains.
  • Recommend Data modeling changes and ETL changes to improve maintainability, data quality, best practices and performance.
  • Expert in developing scripts for Teradata MLOAD, FASTLOAD, FASTEXPORT, BTEQ and TPT.
  • Extensively worked in both UNIX/LINUX (AIX/HP/Sun Solaris) and Windows (Windows NT/2000) platforms and expert in UNIX Shell Scripting.
  • Strong ETL knowledge of Informatica PowerCenter v9/8.6/7.x/6.x/5.x, Datastage 7.x/8.x, Teradata Tools and utilities and Strong background in Oracle 10/8i, PL/SQL and SQL.
  • Extensively used TERADATA utilities - Teradata Parallel Transporter (TPT), FastLoad, MultiLoad, TPump, BTEQ scripting, FastExport, SQL Assistant, Database Query Log (DBQL) and Database Query Management (DBQM).
  • Expert in design and development of Informatica mappings/sessions/workflows to move data from source to reporting data warehouse for complex MSI Business Intelligence project in Financial Sector (Insurance, Banking).
  • Strong skills in coding and debugging Teradata utilities like FastLoad, FastExport, MultiLoad and TPump for ETL processing of huge data volumes.
  • Proficient in Collect Statistics, Database Sizing, Capacity Planning, Database Performance Monitoring, Database Backup/Recovery and SQL Query Tuning.
  • Expert in data extraction, transformation and load from disparate data sources using Informatica Power Center Designer (versions 9.0/8.5) framework and standards.
  • Experienced in designing and developing of Extract Transform and Load (ETL/ELT) processes on Teradata, Informatica Power Center Designer (versions 9.0/8.5) framework & standards
  • Experienced in Teradata database management/administration activities: user, role and profile creation; monitoring database- and table-level growth and assigning space proactively before or after loading; collecting statistics beyond the PI/UPI to optimize query performance where required; monitoring and reporting on the growth of databases; monitoring and adjusting spool space utilization and requirements; setting up and maintaining access rights to Teradata system database objects; creating and altering tables, views, macros and other database objects as required; and performance tuning.
  • Designed and implemented Performance improvement changes to nightly ETL processes.
  • Expertise in Performance Tuning of data flow processes (ETL mappings) and loading high volumes of data on daily basis into data warehouse in a given load window using Informatica Partitioning and Pushdown Optimization.
  • Experienced in the department-wide Informatica upgrade from version 7 to version 9 and the Teradata upgrade from 12 to 13.
  • Proficiency in Prioritizing and Multi-tasking to ensure that tasks are completed on time.
  • Trained development team in Informatica Version 8/9 & Teradata 12/13 upgrade process and new features.
  • Adhere to company and Project Standards and Guidelines.
  • Ability to work on multiple projects/tasks simultaneously to meet deadlines.
  • Performed project planning and estimation. Coordinated all aspects of the project. Managed the projects’ scope of work, schedule, and budget. Capable of balancing long and short-term priorities.
  • Good work ethics and “can-do” attitude with excellent Analytical, Programming, Problem solving, communication & interpersonal skills.
  • Competitive spirit in finding various approaches to problem solving; highly skillful, creative and innovative in developing efficient logic/code.
  • Quick adaptability to New Technologies and zeal to improve technical skills.
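As an illustration of the volatile-table reporting pattern mentioned above, a minimal BTEQ sketch follows. All logon details, object and column names are illustrative, not from any actual engagement:

```sql
.LOGON tdpid/etl_user,password;

/* Volatile table is scoped to this session and dropped at logoff */
CREATE VOLATILE TABLE vt_monthly_sales AS (
    SELECT store_id,
           EXTRACT(MONTH FROM sale_date) AS sale_month,
           SUM(sale_amt)                 AS total_amt
    FROM   edw.sales_fact
    GROUP  BY 1, 2
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

/* Report query joins the volatile summary back to a dimension view */
SELECT d.store_name, v.sale_month, v.total_amt
FROM   vt_monthly_sales v
JOIN   edw.store_dim d ON d.store_id = v.store_id
ORDER  BY 1, 2;

.LOGOFF;
```

The same summary could be built with a global temporary table when the definition must persist across sessions.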

TECHNICAL SKILLS:

ETL Tools: WhereScape RED, Informatica PowerCenter/PowerMart 9.1/9/8.6.1/7.1.3/6.1/5.1, DataStage 7.x/8.x, Teradata load/unload utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT)

Teradata Tools & Utilities: Teradata Viewpoint, Visual Explain, Teradata System Emulation Tool, Teradata SQL Assistant, Teradata Statistics Wizard, Teradata Dynamic Workload Manager, Teradata Query Scheduler Admin.

RDBMS: Teradata V14, V13.1, V12, V2R5.x/6.x, Oracle 8i, 9i, 10g, MS SQL Server, MS Access

Operating Systems: MS-DOS, HP-UX, Windows 95/98/2000/XP/NT, MVS, LINUX

Scheduling Tools: CONTROL-M, WLM (Workload Management), IBM Tivoli (Maestro), ESP, UNIX Korn shell

Data Modeling Tool: ER Studio 7.1.1, ERwin 4.0/3.5.2/3. x

OLAP & Visualization Tools: MicroStrategy 8.0.2, Business Objects 5.1, Cognos, Tableau Desktop 8.2/8.1/7, Tableau Server 8.2/7, Tableau Reader

Languages: SQL, C and UNIX Shell Scripting

Data Processing Tools: BTEQ, FastLoad, MultiLoad, FastExport and TPump

Database Tools: SQL Assistant and Toad

Others: Teradata CRM (TCRM)
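As a sketch of the Teradata load utilities listed above, a minimal FastLoad script for bulk-loading a delimited file into an empty staging table. Host, credentials, file path and object names are illustrative:

```sql
/* FastLoad sketch: pipe-delimited file into an empty staging table */
SESSIONS 8;
LOGON tdpid/etl_user,password;

DROP TABLE stg.customer_stg;
DROP TABLE stg.customer_stg_err1;
DROP TABLE stg.customer_stg_err2;

CREATE TABLE stg.customer_stg (
    cust_id    INTEGER,
    cust_name  VARCHAR(100),
    state_cd   CHAR(2)
) PRIMARY INDEX (cust_id);

SET RECORD VARTEXT "|";

DEFINE cust_id    (VARCHAR(11)),
       cust_name  (VARCHAR(100)),
       state_cd   (VARCHAR(2))
FILE = /data/in/customer.dat;

BEGIN LOADING stg.customer_stg
      ERRORFILES stg.customer_stg_err1, stg.customer_stg_err2;

INSERT INTO stg.customer_stg (cust_id, cust_name, state_cd)
VALUES (:cust_id, :cust_name, :state_cd);

END LOADING;
LOGOFF;
```

FastLoad requires an empty target; MultiLoad or TPump would be used for incremental loads into populated tables.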

PROFESSIONAL EXPERIENCE:

Confidential, ATLANTA, GA

Teradata/ETL/Solutions Architect

Responsibilities:

  • Played the role of Teradata Technical Lead/ETL Architect for the Confidential & Confidential alliance program and was responsible for designing the ETL strategy and architecture of the project.
  • Responsible for managing tasks and deadlines for the ETL teams, both onsite and offshore.
  • Initiated and developed the end-to-end roadmap solutions for Alliance business initiatives (Wireless, Uverse, Enabler, DTV, Cricket, ABS (ATT Business Solutions)).
  • Contributed to solution engineering, project coordination, business requirements analysis, requirement specification, and project documentation
  • Point of contact on the ETL team for other teams such as Reporting, Testing, QA and Project Management for updates on project status and issues.
  • Understanding the Cross Product (Telecom and Entertainment Group) domain functionality.
  • Leading ETL design and development using multiple industry standard reporting and dashboard tools
  • Translating and transforming business requirements into data models (Conceptual, Logical and Physical)
  • Conducted impact assessments and determined size of effort based on requirements.
  • Developed data models for database structures using Data modeling tools such as ERWIN / ER studio
  • Implemented the data architecture and design for Alliance - EG ( Entertainment Group )
  • Prepared presentations on EDW/ECDW/ALLIANCE and socialize with multiple teams
  • Reviewing all project-level data movement designs for adherence to standards and best practices, and suggesting changes to project-level designs.
  • Working with the Scrum Master on user stories, breaking the work into task lists and estimating based on a simple/medium/complex methodology.
  • Interaction with business users and requirement gathering.
  • Building High level & low-level ETL flow design
  • Presenting the design with Business Team.
  • Involved in detailed technical design
  • Give functional KT to the QA team.
  • Was involved in conducting the review of Teradata/Informatica Code, Unit Test Cases & Results with the Developers.
  • Organize daily/weekly technical discussions with the Onsite team also including the individual offshore work stream leads and set expectations for offshore delivery.
  • Extensively used Informatica Pushdown Optimization when loading Stage tables into Foundation tables.
  • Optimized Query Performance, Session Performance and Reliability
  • Helping the other members of the business intelligence team and executing ETL-related work.
  • Coordinating with end users to strengthen the overall architecture expertise of the team.
  • Coordinating with other IT teams to collect design needs and producing structured technical documents based on operational needs.
  • Assisting users and writing and producing technical specifications.
  • Defining the core ETL architectural rules and standards.
  • Reviewing the structural and technical designs of the systems to ensure that decisions align with existing and forthcoming business plans and opportunities.
  • Owning the architectural production activities related to Alliance ETL projects.
  • Performing other job duties and roles under the guidance of management.
  • Supervising and guiding ETL architecture implementations.

Confidential, RICHMOND, VA

Sr Teradata Solutions/Data/ETL Architect

Responsibilities:

  • Played a key role in Altria Client Services Migration & Maintenance and DATA LAKE implementation projects.
  • Implemented changes such as e-Vapor cartridge volume allocation, calculating the physical sales geography, Brand Master changes, Altria shipment analysis, integrating the new trade programs, and building the volume for all brand masters by category (Cigarettes, Cigars, Smokeless).
  • Developed proofs-of-concept and prototypes to help illustrate approaches to AALIMS and DATALAKE.
  • Implemented the reconciliation process for store and shipment (Cigarettes, Cigars and Smokeless) information between MSA and ALTRIA.
  • Migrating the existing Teradata BTEQ scripts into the WhereScape RED ETL tool and building the ETL process in WhereScape RED.
  • Implemented the new DATA LAKE process in the WhereScape RED ETL tool.
  • Developed Dashboard reports for the Key Performance Indicators for the top management.
  • Created different visualizations using Bars, Lines and Pies, Maps, Scatter plots, Gantts, Bubbles, Histograms, Bullets, Heat maps and Highlight tables.
  • Involved in creating dashboards by extracting data from different sources.
  • Created dashboard using parameters, sets, groups and calculations.
  • Involved in creating interactive dashboard and applied actions (filter, highlight and URL) to dashboard.
  • Involved in creating calculated fields, mapping and hierarchies.
  • Analyzing the Altria Inventory Model on monthly basis and investigating the volume differences.
  • Cleaning the Location Subject Area and keeping the accurate geographical information.
  • Implemented the DataStage to Teradata/Oracle migration of the AALIMS (Laboratory Information Management System) application.
  • Implemented the DATA LAKE to support the current Altria analytics.
  • Loaded data into the Teradata database using load utilities (FastExport, FastLoad, MultiLoad and TPump).
  • Tuning user queries and frequently used SQL operations to improve performance.
  • Resolving various defects in the set of wrapper scripts that executed the Teradata BTEQ, MLOAD and FLOAD utilities and UNIX shell scripts.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Responsible for designing and implementing a project’s technical architecture and technical design documents to communicate solutions that will be implemented by the development team
  • Conducting workshops and other customer-interactive events; working with business teams and technical analysts to understand business requirements.
  • Serve as a technical resource for business teams to help define, estimate, and propose solutions for business opportunities.
  • Worked on multi-value compression analysis, identifying skewed tables and taking the necessary action.
  • Worked on Performance Tuning activities.
  • Leading the technical aspects of an Agile or waterfall based project delivery and responsible for project timelines.
  • Responsible for defining technical solutions.
  • Provide Logical/Physical modeling along with Design services.
  • Responsible for whole Project technical implementations and Planning.
  • Lead Requirements review and assessment of work
  • Performed Solution and Architecture reviews
  • Providing technical inputs to development teams.
  • Perform development as needed.
  • Coordinating with other teams/vendors for overall project deliverables.
  • Coordinating with business partners and other technical leads for the solution
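The multi-value compression and skew analysis mentioned above can be sketched as follows. Table, column and value lists are illustrative; in practice the compressed values come from a frequency analysis of the actual data:

```sql
/* Multi-value compression: frequent values are stored once in the
   table header instead of in every row */
CREATE TABLE edw.shipment_fact (
    shipment_id   INTEGER,
    category_cd   VARCHAR(10)   COMPRESS ('CIG', 'CIGAR', 'SMKL'),
    carrier_cd    CHAR(4)       COMPRESS ('UPS ', 'FDX '),
    net_units     DECIMAL(12,2) COMPRESS (0.00)
) PRIMARY INDEX (shipment_id);

/* Skew check: compare average vs. maximum per-AMP space usage
   from the data dictionary */
SELECT TableName,
       SUM(CurrentPerm) AS total_perm,
       100 * (1 - AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm))) AS skew_pct
FROM   DBC.TableSizeV
WHERE  DatabaseName = 'edw'
GROUP  BY 1
ORDER  BY skew_pct DESC;
```

A high skew percentage usually points to a poorly chosen primary index; the fix is typically a different PI column set rather than compression.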

Confidential, COLUMBUS, OH

Sr Teradata Solutions/Data/ETL Architect

Responsibilities:

  • Played a key role in JPMC RFS Migration CST/CDM marts project to build the Customer Mart.
  • Understanding the requirements and building the new ETL code in Informatica by referring to the existing Ab Initio code for each logical part of the existing EDW system.
  • Preparing the reverse engineering documents by analyzing the Ab Initio graphs.
  • Preparing the Mapping Documents and building the ETL Code to Load the data into the Staging tables.
  • Addressing the Complex issues during the EDW to ICDW migration process.
  • Involved in End-to-End ITSM change control Process for UAT and Prod.
  • Participating in the release management activities and discussions.
  • Conducting the Internal Peer reviews.
  • Working closely with the performance testing team, addressing all the issues raised by them and getting the components signed off.
  • Working with the App Dev Support and EDW/ICDW Prod Support team to address the current production issues as well the ETL flow execution.
  • Responsible for architecting the ETL Components (Location Subject Area) and Creating design specifications, ETL design documents.
  • Worked with the source systems developers and business owners to understand the data sources for defining data extraction methodologies and made decisions on appropriate Extraction, Transformation and loading Strategies.
  • Worked on multi-value compression analysis, identifying skewed tables and taking the necessary action.
  • Worked on Performance Tuning activities.
  • Creating database objects, and performing database performance tuning and monitoring.
  • Develop/enhance ETL processes to load data into Integration and Semantic Layer.
  • Create, maintain, and manage documentation for data mart and report development efforts.
  • Responsible for adhering to business intelligence standards and best practices, supporting Data management and other lines of business.
  • Designed and Developed ETL Solutions (Informatica workflows, Tivoli Schedules Unix shell scripts) to support operational management of data solutions.
  • Hands-on experience with Teradata SQL and utilities (BTEQ, TPT and FastLoad).
  • Provided production support of data solutions, including troubleshooting production failures, script revisions and reviews, status notifications and on-going support of development activities.
  • Designed and developed Informatica workflow designs to support execution of TPT scripts that will be used to load data into departmental data marts.
  • SQL Code performance and tuning.
  • Designed and built Tivoli/Control-M schedules for scheduling one-time and recurring executions of the Informatica workflows.
  • Documented technical/operational processes (Informatica, Tivoli, Control-M, etc.) and run book procedures.
  • Designed and built UNIX shell scripts as needed.
  • Work with end user and internal teams to define business, data and technical requirements.
  • Leading discussions related to technical processes for the purpose of root/cause analysis and determining process activities, which must be performed (and in what sequence) in order to execute load processes and/or recover from failures.

Confidential, Richmond, VA

Sr Teradata Solutions/Data/ETL Architect

Responsibilities:

  • Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project
  • Responsible for architecting the ETL lifecycle and Creating design specifications, ETL design documents.
  • Worked with the source systems developers and business owners to understand the data sources for defining data extraction methodologies and made decisions on appropriate Extraction, Transformation and loading Strategies
  • Analyze & translate functional specifications & change requests into technical specifications.
  • Identified existing/new facts and dimensions from the source system and business requirements to be used for this project and Perform impact analysis, identifying gaps and code changes to meet new and changing business requirements.
  • Coordinating with the onshore and offshore teams on a daily basis.
  • Identifying the Candidate columns to collect the stats and documenting the same
  • Preparing the Joins Report
  • Preparing the Database object workbook which keeps track of all the database object requirements through-out the SDLC
  • Preparing the Release Document which includes all the New Objects information for each environment
  • Maintaining the New/Modified Model changes and getting it approved in the Change Control Meetings with the help of Business Analyst
  • Publishing the Physical Model to DBA team after the proper approvals
  • Analyzed the complex ETL requirements/tasks and provided estimates etc…
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Update Strategy, Look Up, Parser, and XML Source Qualifier etc…
  • Worked on the Power Exchange to pull the data from the mainframe.
  • Implemented Slowly Changing Dimensions as per the requirement.
  • Worked closely with the ITT team to perform the Unit, Integration, Functional and Performance testing.
  • Planning and executing the unit tests and validating expected results; iterating until test conditions have passed.
  • Creating the Environment Set up sheet as per the project need
  • Extensively involved in the performance tuning effort of TERADATA SQLs to make the code more efficient and meet timelines.
  • Reviewing the Coding/DBA Standard Check List prepared by the developers
  • Reviewing the code and getting the required approval
  • Creating the Work Order ticket in TSRM to set up the SIT Environment
  • Checking the code in the Clear Case Stream
  • Involved in the process to baseline the code
  • Preparing the Implementation Work book, Database objects implementation book, Infrastructure change request form, master stats sheet and connection string form
  • Reviewing the entire SDLC Process with the team
  • Getting the approvals from the Project DBA teams after the SIT data loads.
  • Involved in the Code move to UAT from SIT and making sure that all the required documents are updated
  • Conducting the SDLC UAT Review meeting
  • Working closely with the Release Test Team and getting the approvals
  • Preparing the Production Implementation plan.
  • Performed migration of mappings /workflows from Development to Test and to Production Servers.
  • Involved in Data Quality, Data profiling, Data Cleansing and metadata management
  • Preparing the Implementation plan and sharing across the team and the release management.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Developed the Teradata Load Scripts (BTEQ/MLOAD/FLOAD)
  • Modifying existing code to provide defect fixes for existing ETLs
  • Recommending improvements to data architecture processes to ensure high quality of data architecture deliverables and consistency.
  • Guiding the other team members on considerations in the design of ETL architecture building blocks.
  • Solving the moderately complex issues in ETL design.
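The candidate-column statistics work and Teradata SQL tuning noted above typically follow a pattern like this (object and column names are illustrative):

```sql
/* Collect stats on join/filter columns beyond the primary index */
COLLECT STATISTICS ON edw.loc_dim COLUMN (state_cd);
COLLECT STATISTICS ON edw.loc_dim COLUMN (region_id, district_id);

/* Confirm what the optimizer now sees */
HELP STATISTICS edw.loc_dim;

/* Re-check the plan: with fresh stats the optimizer should pick
   better join methods and row-count estimates */
EXPLAIN
SELECT l.state_cd, SUM(s.sale_amt)
FROM   edw.loc_dim   l
JOIN   edw.sales_fact s
  ON   s.region_id   = l.region_id
 AND   s.district_id = l.district_id
GROUP  BY 1;
```

Documenting the chosen columns, as described above, keeps the stats-refresh jobs and the master stats sheet in sync.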

Environment: Teradata V13, UNIX HP-UX, Windows NT, BTEQ, Teradata Parallel Transporter (TPT), MultiLoad, TPump, FastExport, FastLoad, SQL Assistant, SAP Business Objects, Erwin 8.0.

Confidential, Columbus, Ohio

Sr.Teradata/ETL Lead Consultant

Responsibilities:

  • Participated in requirement gathering session with business users and sponsors to understand and document the business requirements as well as the goals of the project.
  • Involved in Data Quality, Data profiling, Data Cleansing and metadata management
  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Helped the data analyst team and providing the existing data behavior information.
  • Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
  • Implemented the Slowly changing dimension scheme (Type II) for most of the dimensions.
  • Implemented the standard naming conventions for the fact and dimension entities and attributes of logical and physical model.
  • Created the DDL scripts using ER Studio and source to target mappings (S2T- for ETL) to bring the data from multiple sources to the warehouse.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev and Information architects to make sure all the requirements are fully covered.
  • Analyzed large number of COBOL copybooks from multiple mainframe sources (16) to understand existing constraints, relationships and business rules from the legacy data.
  • Reviewed and implemented the naming standards for the entities, attributes, alternate keys, and primary keys for the logical model.
  • Reviewed the logical model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
  • Worked with Other ETL team members and DBA to create the Mapping document and physical model and tables.
  • Developed the ETL process /Design Document for this project.
  • Coordinating with source system owners and day-to-day ETL progress monitoring.
  • Create the mappings using transformations such as the Source qualifier, Parser, Normalizer, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy
  • Created reusable transformations and mapplets to use in multiple mappings.
  • Designed and Developed the ETL/Scheduling, Generic Script Building Process to load the relationship table
  • Designed and Developed the Informatica Mappings/Mapplets/Workflows and Bteq Scripts.
  • Created various re-usable objects to simplify the data processing.
  • Perform impact analysis, identifying gaps and code changes to meet new and changing business requirements.
  • Prepared the Historical data load approach.
  • Created the JNDI connections and the respective binding file to read the MQ thru Informatica.
  • Performed Analysis/design/development/unit testing for PowerCenter upgrade from 8.6 to 9
  • Analyzed UNIX ksh scripts to eliminate redundancy
  • Created couple of common scripts and Mapplets.
  • Helping the infrastructure team during the migration process
  • Worked with Teradata DBA to create the physical model and tables.
  • Scheduled multiple brain storming sessions with DBAs and production support team to discuss about views, partitioning and indexing schemes case by case for the facts and dimensions.
  • Worked on the model based volumetric analysis and data based volumetric analysis to provide accurate space requirements to the production support team.
  • Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System.
  • Worked with the test team to provide insights into the data scenarios and test cases.
  • Tracked the defects in the Quality Center and updated the status in a timely manner during the entire phase of the testing life cycle.
  • Ensured all project documentation has been stored electronically whenever needed during project.
  • Creating the design specification document & Implementation Documents
  • Preparing the Implementation plan and sharing across the team and the release management.
  • Providing input to the Project Manager during the implementation plan preparation.
  • Worked on HP Quality Center to track the defect logged against the logical/physical model and ETL Process.
  • Worked with client and off shore team to make sure that the reports and dashboards are delivered on time.
  • Supporting the Current Production flows on 24x7 by coordinating with the offshore team.
  • Communicating the Status Updates / Production abends / Resolution details / Risk factors to the Support team & Galaxy Team.
  • Coordinating with the Galaxy teams by conducting the touch points during the critical situation of the production issues.
  • Involved in identifying the performance bottlenecks and optimizing respective databases and processes for performance improvement.
  • Preparing the Incident report on monthly basis and sharing across the team and discussing with the team for the improvement on tickets resolutions.
  • Composing the Maestro schedules and jobs and compiling them in the respective environments.
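The Type II slowly changing dimension scheme mentioned above can be sketched in Teradata SQL as an expire-then-insert pair (table and column names are illustrative):

```sql
/* Type II SCD: close out the current version of each changed row... */
UPDATE edw.customer_dim
SET    eff_end_dt = CURRENT_DATE - 1,
       curr_ind   = 'N'
WHERE  cust_id IN (SELECT cust_id FROM stg.customer_delta)
AND    curr_ind = 'Y';

/* ...then insert the new version as the open, current row */
INSERT INTO edw.customer_dim
      (cust_id, cust_name, segment_cd, eff_start_dt, eff_end_dt, curr_ind)
SELECT cust_id, cust_name, segment_cd,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.customer_delta;
```

In Informatica the same effect is achieved with an Update Strategy transformation routing expired and new versions to the dimension target.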

Environment: TERADATA, INFORMATICA, ORACLE, IBM Tivoli and UNIX

Confidential

Sr ETL/Teradata Lead Consultant

Responsibilities:

  • Involved in Analysis, Requirements Gathering, SRS (Software Requirement Specifications) & HLDD (High Level Design Document) & LLDD (Low Level Design Document).
  • Developed the DataStage jobs/sequences based on the technical specifications provided by the analyst team for the specific domain (e.g., CRM, Payments).
  • Analyzing and loading the EBCDIC and ASCII source files into datasets using DataStage.
  • Loading the datasets into Oracle tables and performing the testing.
  • Performed unit testing/peer reviews once the DataStage job development was completed.
  • Delivering the deliverables as per the schedule without fail.
  • Helping in resolving the other team members Technical Issues.
  • Converting the Data (i.e. converting the Confidential data into ABN AMRO format).
  • Fixing the defects that have been raised through the Quality Center and Implementing the Change Requests.
  • Promoting DataStage objects from environment to environment.
  • Maintaining the migration framework, reconciliation framework, technical validation framework and Oracle control tables required for this project.
  • Maintaining the reject file with zero records; in general, any record from the source files that doesn't fit the corresponding file definition is written to a reject file.
  • Validating the full volume of data loaded into DataStage datasets against a set of technical rules, including cluster selection and the filtered data.

Confidential

Sr. ETL/Teradata Lead Consultant

Responsibilities:

  • Coordinating with source system owners and day-to-day ETL progress monitoring.
  • Involved in Analysis, Requirements Gathering, SRS (Software Requirement Specifications) & HLDD (High Level Design Document) & LLDD (Low Level Design Document).
  • Involved in analyzing existing logical and physical Data modeling using Erwin.
  • Responsible for creating & running SQL scripts for DDL, DML operations on Teradata/Oracle DB.
  • Designed the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units in tables.
  • Involved in data migration to import legacy data from one system to the other.
  • Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
  • Developed the Teradata Load Scripts (BTEQ/MLOAD/FLOAD), Informatica Mappings with the help of Source Mapping provided by the DBS WA team.
  • Worked with Informatica PowerExchange, which scales effectively from high-volume batch to low-latency complex data.
  • Very strong in data analysis and ETL solution design and development
  • Widely used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer and Informatica Work Flow Manager.
  • Used Transformations like look up, Router, Filter, Joiner, Stored procedure, Source Qualifier, Aggregator and Update strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Performed Pipeline partitioning to optimize the performance of mappings.
  • Created Mapplet and used them in different Mappings.
  • Partitioned Sessions for concurrent loading of data in to the target tables.
  • Performed incremental aggregation to load incremental data into Aggregate tables.
  • Created Schema objects like Indexes, Views and Sequences.
  • Developed stored procedure to check source data with Warehouse data and if not present, write the records to spool table and used spool table as Lookup in Transformation.
  • Used workflow manager for session management, database connection management and scheduling of jobs.
  • Sharing the ad hoc and weekly project reports with the project stakeholders and the BDW management team, and coordinating with the DBS PM/IT Lead on a day-to-day basis to update the status of project activities at the offshore location.
  • Wrote UNIX shell scripts for the Informatica ETL tool to automate sessions and cleanse source data.
  • Performed extensive performance tuning by determining bottlenecks at various points such as targets, sources, mappings and sessions.
  • Created sessions and batches to move data at specific intervals and on demand using Workflow Manager.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
  • Created User Interface reports for validating the Data through Reporting Services.
  • Used PMCMD to run workflows and crontab to automate their schedules.
  • Was responsible for migration/conversion of Informatica PowerCenter from Informatica 7.1.3 to Informatica 8.5.
  • Involved in upgrading and configuring Informatica Power Exchange from 8.5 to 8.6.
  • Created test cases for Unit Test, System Integration Test and UAT to check data quality.
  • Involved in Promotion Change Control Methodology.
  • Developed views necessary for structured and ad-hoc reporting.
  • Involved in version control of code from Development to Test and Production environments using ChangeMan.
  • Used Rational Rose to model the process using UML to create behavioral and structural diagrams.
  • Generated reports using Business Objects Report Designer.
  • Prepared the job sheet to schedule jobs in the TIVOLI scheduler tool.
  • Ensured backup of code on a weekly basis.
  • Performed program development and reviews as necessary.
  • Assisted and supported DWCC teams and users during SIT, UAT and the Warranty Period.
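
The Teradata load scripts mentioned above (BTEQ/MLOAD/FLOAD) typically follow a common wrapper pattern. Below is a minimal, hypothetical sketch of a shell wrapper that generates a BTEQ import script; the host alias, credentials, file path, table and column names are placeholders, not the actual project objects:

```shell
#!/bin/sh
# Hypothetical BTEQ load wrapper -- all object names below are placeholders.
TDP_ID="tdprod"                          # assumed Teradata host alias
TARGET_TBL="edw_stage.txn_daily"         # assumed staging table
SRC_FILE="/data/in/txn_daily.txt"        # assumed pipe-delimited feed
BTEQ_SCRIPT="/tmp/load_txn_daily.bteq"

# Generate the BTEQ script. In production it would then be executed with:
#   bteq < "$BTEQ_SCRIPT"   (omitted here; the bteq client is site-specific)
cat > "$BTEQ_SCRIPT" <<EOF
.LOGON ${TDP_ID}/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = ${SRC_FILE};
.REPEAT *
USING (txn_id VARCHAR(18), txn_amt VARCHAR(18))
INSERT INTO ${TARGET_TBL} (txn_id, txn_amt)
VALUES (:txn_id, :txn_amt);
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "generated BTEQ script: $BTEQ_SCRIPT"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets the calling scheduler (TIVOLI, cron) detect a failed load from the wrapper's exit status.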

Environment: TERADATA, INFORMATICA, ORACLE, IBM Tivoli and UNIX

Confidential

Sr Teradata Lead Consultant

Responsibilities:

  • Involved in systems study and analysis to understand the business needs and implement them in a functional database design.
  • Defined the ETL strategy for Data Warehouse population.
  • Involved in Data Quality Analysis to determine cleansing requirements.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Performed extensive analysis of metadata to test the integrity, consistency and appropriateness of the data to be brought into the centralized Data Warehouse from various sources.
  • Installed, Maintained and Documented the Informatica Power Center setup on multiple environments.
  • Designed the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units in tables.
  • Worked on Informatica PowerCenter 6.2 and created Informatica mappings with PL/SQL procedures/functions to build business rules to load data. Transformations used included Source Qualifier, Aggregator, Lookup, Filter and Sequence Generator.
  • Created sessions and batches to move data at specific intervals and on demand using Workflow Manager.
  • Developed the Teradata Load Scripts (BTEQ/MLOAD/FLOAD)
  • Extensively worked on the Database Triggers, Stored Procedures, Functions and Database Constraints. Written complex stored procedures and triggers and optimized for maximum performance.
  • Created UNIX shell scripts (Scheduler utilities) for automating the backup of Database/ Transaction log.
  • Identified the major entities in the databases and prepared E-R diagrams depicting major attributes and the relationships between these entities.
  • Prepared a high-level logical data flow diagram for THE BANK as a whole, in addition to business-unit-wise data flow diagrams.
  • Collected the Business Metadata (confined to data elements relevant for EDW) and delivered it in the form of an MS Excel workbook.
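
The scheduler utilities for backup automation noted above generally reduce to a timestamped dump plus retention pruning. A minimal sketch, assuming a stubbed dump step and placeholder paths (a real site would call its database/transaction-log export tool where the stub is):

```shell
#!/bin/sh
# Hypothetical backup scheduler utility -- paths and retention are assumptions.
BACKUP_DIR="${BACKUP_DIR:-/tmp/edw_backups}"   # assumed backup location
RETENTION_DAYS=7                               # assumed retention window
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$BACKUP_DIR"

run_dump() {
    # Placeholder for the actual database/transaction-log dump command.
    echo "-- transaction log dump taken at $STAMP" \
        > "$BACKUP_DIR/txnlog_$STAMP.dmp"
}

run_dump

# Prune dumps older than the retention window.
find "$BACKUP_DIR" -name '*.dmp' -mtime +"$RETENTION_DAYS" -exec rm -f {} \;

echo "backup written: $BACKUP_DIR/txnlog_$STAMP.dmp"
```

A script of this shape is then registered once in cron or the site scheduler, so the backup and the pruning run as a single unattended unit.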

Confidential

Teradata Developer

Responsibilities:

  • Analyzed the Data Mapping Sheets provided by the client.
  • Developed both Technical and functional specifications/Design Document through meetings with Business Analysts, Data modelers, users and DBA team.
  • Primary responsibility comprised the development of loading scripts for all source systems for the Gujarat, Karnataka, Chennai, Haryana, Punjab, Andhra Pradesh and Uttar Pradesh circles, as well as the Rest of Bengal and BPL circles.
  • Understood the data mapping document and built the Teradata load scripts.
  • Loaded data from source systems into the Teradata warehouse using Teradata Tools and Utilities, and implemented CRs (Change Requests) for all circles.
  • Prepared the data purging process, run every 2 months, and maintained the Hutch Data Warehouse (EDW).
  • Performed proactive problem detection and correction for system-related issues and resolved issues during the data loading process.
  • Tuned existing processes for better performance.
  • Monitoring the Teradata load process by using Teradata PMON.
  • Developed UNIX shell scripts.
  • Created and maintained databases, users, tables, views, stored procedures and macros using established processes.
  • Scheduled backup and recovery processes and monitored scheduled backups/archives.
  • Managed allocated disk space: database, user work, load work, and spool space.
  • Responded to unexpected database problems on an on-call basis.
  • Implementing the Change Requests for EDW.
  • Participated in business requirement gathering sessions.
  • Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
  • Trained in Inmon and Kimball approaches for data warehouse design.
  • Worked on Normalization and Denormalization techniques.
  • Trained on building Conceptual, Logical and Physical data model.
  • Defined relationships and cardinalities among entities.
  • Developed queries using PL/SQL and many stored procedures to do the validations.
  • Created and maintained database objects (Tables, Views, Indexes, Partitions, Database Triggers, etc.).
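
The two-monthly purge described above can be sketched as a script that emits the delete statement for rows past the retention window; the table and column names here are assumptions, not the actual EDW objects:

```shell
#!/bin/sh
# Hypothetical purge-script generator -- table/column names are placeholders.
PURGE_SQL="/tmp/purge_cdr_history.sql"
RETENTION_MONTHS=2    # purge cadence from the process described above

cat > "$PURGE_SQL" <<EOF
-- Purge history older than ${RETENTION_MONTHS} months (placeholder objects).
DELETE FROM edw.cdr_history
WHERE call_date < ADD_MONTHS(CURRENT_DATE, -${RETENTION_MONTHS});
-- Refresh optimizer statistics after the bulk delete.
COLLECT STATISTICS ON edw.cdr_history COLUMN (call_date);
EOF

echo "purge script written: $PURGE_SQL"
```

Recollecting statistics after the delete matters on Teradata because a large purge can leave the optimizer with stale demographics for the partitioned date column.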
