
Technology Manager Resume Profile


Connecticut

Experience Highlights

  • Over 17 years of industry experience in software development, data warehousing, architecture, technology leadership, and project management.
  • Responsible for developing, maintaining, and managing the application development process.
  • Established and managed Data Integration COE and Shared Services.
  • Led strategic data warehouse, business intelligence, and development projects. Coordinated project status meetings and resource planning across all phases of data warehouse delivery and the SDLC.
  • Experience in solutions practice using Kimball, Inmon, and hybrid architectures; transactional and dimensional data modeling (OLTP, ETL, BI, star schema, and snowflake schema).
  • Good understanding of the Financial Services industry.
  • Makes project related decisions and provides input into decisions impacting the broader team.
  • Subject matter expert across multiple technologies, architectures, and business applications with special emphasis on application/systems inter-dependencies.
  • Business/application knowledge: demonstrates deep knowledge and expertise in multiple, inter-dependent business applications and processes.
  • Partners with development teams and clients and provides leadership in a matrixed environment.
  • Expert in vendor relationship management.

Areas of Expertise

  • ETL Tools - Informatica PowerCenter 9.1, PowerExchange (DB2 z/OS, Oracle, MS SQL Server, MQ, JMS), Data Transformation, Metadata Manager
  • Big Data - Hortonworks Data Platform 2.0, 1.2, Hadoop, Hive, Pig, Sqoop
  • Pentaho - Pentaho Data Integration 4.4, Business Analytics
  • Data Virtualization - Composite 3.1
  • EAI - Tibco BusinessWorks 5.0
  • Data Quality - Trillium 9
  • Metadata Manager - MetaIntegration 5.1
  • BI - Business Objects, Cognos
  • Databases - Oracle 11G, Exadata, Sybase ASE 4.9-12.5, MS SQL Server, DB2 UDB, DB2 z/OS
  • Power Builder, DB Artisan, ER Studio, ERWin
  • Unix (Solaris/AIX/HP-UX), C, Unix shell scripting, Perl, Autosys, Harvest, VSS
  • Java, XML, HTML, VB 4.0-6.0, ASP, .NET, PowerBuilder, Gupta SQL
  • EAGLE PACE, EAGLE Performance

Professional Experience

Confidential

Technology Manager

  • Managed global team of resources/developers both onsite and off-shore.
  • Implemented a multi-tenant model for the Hadoop ecosystem across the enterprise.
  • In-depth knowledge of Big Data solutions and the Hadoop ecosystem.
  • Architected and executed the FX Execution Quality Analysis to identify trade patterns and performed comparative analysis against tick-level data.
  • Led the Global Markets FX Valuation Rate Analysis to provide a self-serve, ad hoc tool to slice and dice FX data, delivering insights such as net profit/loss, total volume, and rate comparisons.
  • Documented various Architectural Design Patterns.
  • Evaluated various technologies and conducted POC in the Big Data space.
  • Established a strategy for data archival leveraging Big Data ecosystem.
  • Designed the data lake architecture as a centralized data hub that delivers data on demand to downstream applications, reducing storage cost and simplifying data access.
  • Implemented Data Governance Strategy for the Data Lake.
  • Currently executing an information security analytics initiative to detect anomalies and gain insight into logs such as proxy, Active Directory, and email logs.
  • Partnered with the Senior Business leaders in successfully executing strategic initiatives.
  • Worked directly with business stakeholders to address ad hoc requests and gain deeper insight into transactional trade data.
  • Mentored members of my own team, colleagues on other teams, and college interns.
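
The data lake and archival strategy described above are commonly organized along Hive-style `year=`/`month=` partition directories so downstream consumers can prune by date. Below is a minimal local sketch of that idea; all paths and the `trades` feed name are hypothetical, and plain `mkdir`/`mv` stand in for the `hdfs dfs -mkdir`/`-mv` commands used on an actual cluster:

```shell
#!/bin/sh
# Illustrative sketch: archiving feed files into Hive-style date partitions.
# STAGING and LAKE are hypothetical local paths standing in for HDFS URIs.
set -eu

STAGING=/tmp/lake_demo/staging
LAKE=/tmp/lake_demo/archive

mkdir -p "$STAGING" "$LAKE"

# Create a couple of sample feed files named <feed>_<YYYYMMDD>.csv
echo "t1,100" > "$STAGING/trades_20140102.csv"
echo "t2,200" > "$STAGING/trades_20140215.csv"

# Move each file into a year=/month= partition derived from its name.
for f in "$STAGING"/*_????????.csv; do
  base=$(basename "$f")
  ymd=${base##*_}
  ymd=${ymd%.csv}
  year=$(printf '%s' "$ymd" | cut -c1-4)
  month=$(printf '%s' "$ymd" | cut -c5-6)
  dest="$LAKE/trades/year=$year/month=$month"
  mkdir -p "$dest"
  mv "$f" "$dest/$base"
done

ls "$LAKE/trades/year=2014/month=01"
```

On a real Hortonworks cluster the same layout would live under an HDFS warehouse path and be registered with Hive (e.g. via `ALTER TABLE ... ADD PARTITION`) so the archived data stays queryable on demand.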

Environment: Hortonworks Data Platform 1.2/2.1.2, Hadoop, HDFS, Sqoop, Hive, HBase, Flume, Talend 5.4, Teradata, Vertica, Infobright, Informatica PowerCenter 9.5, Tableau, Pentaho Kettle, Platfora, Splunk, Kafka, Storm.

Confidential

Technology Manager

  • Managed a team of resources/developers both onsite and off-shore.
  • Worked as the lead liaison between business and IT.
  • Expert in vendor management.
  • Managed a high-performing team, attracted top talent, and actively supported innovative, faster, and more cost-effective solutions.
  • Received the best-implementation award for the Custody Information Warehouse.
  • Provided architecture guidance and built the body of knowledge around PowerCenter, PowerExchange (Oracle, DB2 z/OS), and Data Transformation (B2B DT).
  • Designed and Implemented Enterprise Shared Services and Center of Excellence for Data Integration from scratch.
  • Reviewed BRDs and HLD/LLD design documents; tracked issues, staffing assignments, and coordination of the team.
  • Collected and evaluated critical information from multiple sources (ODS), reconciled conflicts, and decomposed information into star schema detail for the Informatica ETL solutions practice.
  • Built an ETL governance framework with prescriptive guidance and best practices for standardized, repeatable enterprise data integration patterns.
  • Promoted ETL shared service for various projects within the Enterprise.
  • Developed and promoted various reusable components.
  • Researched Emerging technologies in the Data Integration domain to address various Business needs within the Enterprise.
  • Evaluated and Implemented Pentaho Data Integration Shared service and COE as an alternate ETL solution to reduce cost for Tier 2 applications.
  • Provided architecture guidance, technology leadership, best practices, high-level design, and detailed design; led development efforts and provided SWAT support and COE services for the following projects:
  • CIW - Custody Information Warehouse
  • Project L - Bridgewater Back Office Integration Project
  • GLS - Global Liquidity System
  • GCX - Global Class Action
  • AARC - Annual Accounting Reporting Conversion
  • EB - Election Banking Reporting Engine
  • YS - Dreyfus Dealer Analytics
  • ERI - Enterprise Risk Initiative
  • FedTic - Worked closely with the FedTic development team to resolve various challenges and promoted the feeds to production.

Environment: Unix, Oracle 11G, Exadata, Sybase, Informatica PowerCenter 9.1/8.6.1/8.1.1, shell scripts, Hummingbird, Data Transformation 8.6, PWX for DB2 z/OS, PWX for Oracle, JMS, MQ, Pentaho Kettle, Carte, Business Objects

Confidential

Technology Manager and Data Integration Architect

  • Led a team of developers; designed, implemented, and managed the centralized ETL architecture for the enterprise ETL shared service (Informatica).
  • Involved in requirements gathering, analysis, design, and implementation of various feed load processes into the central data warehouse using a star schema.
  • Implemented a transaction strategy and a reusable error-handling strategy, and captured load statistics for the data warehouse load process.
  • Implemented a restart strategy and recoverability for ETL loads to the data warehouse.
  • Developed a migration strategy and successfully delivered the Java-to-Informatica conversion project, achieving excellent performance loading the data warehouse.
  • Created data mappings and transformations between the ODS sources and the data warehouse, and further refined the star schema for transactional data, using Informatica as the ETL tool.
  • Guided the developers in tuning the mappings and the jobs to achieve best performance.
  • Carried out technical enhancements of varying size and complexity: gathered requirements, sized deliverables, estimated dates, architected solutions, designed, developed, and planned the QA phase and delivery strategy. Investigated and prototyped new technologies and evaluated how they could be applied to enhance the infrastructure.
  • Evangelized the ETL Shared Service across CAI to reduce cost by leveraging for multiple applications.
  • Set standards and best practices for the ETL developers.
  • Set standards and policies for the division in consultation with the overall organization; maintained and supported the data warehouse. Responsibilities also included creating and implementing disaster recovery strategies and testing, as well as operational support strategies.
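
A restart strategy like the one described in the bullets above is typically built around a checkpoint that records each completed step, so a failed batch can be rerun from the point of failure instead of from scratch. The sketch below illustrates the pattern in generic shell; the checkpoint path and step names are hypothetical, and `echo` commands stand in for the real Informatica workflow invocations:

```shell
#!/bin/sh
# Illustrative restart-strategy sketch: each completed step is recorded in a
# checkpoint file; on rerun, already-completed steps are skipped.
set -eu

CKPT=/tmp/etl_demo/load.ckpt
mkdir -p "$(dirname "$CKPT")"
touch "$CKPT"

run_step() {
  step=$1; shift
  if grep -qx "$step" "$CKPT"; then
    echo "skip: $step (already completed)"
    return 0
  fi
  "$@"                       # the real command for this step
  echo "$step" >> "$CKPT"    # mark the step done only after success
  echo "done: $step"
}

run_step extract  echo "extracting source feed"
run_step stage    echo "loading staging tables"
run_step load_dw  echo "loading warehouse star schema"
```

Because a step is appended to the checkpoint only after its command succeeds, rerunning the script after a failure skips the completed steps and resumes at the failed one, which is the essence of recoverable batch loads.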

Environment: Unix Solaris 10, ORACLE 9i, SQL Plus, PL/SQL, Sybase, Informatica - Powercenter 8.1.1, 7.1.3, Shell scripts, Hummingbird, SQL Loader, Business Objects 11

Confidential

Data Integration Architect

  • Involved in the design of the centralized ETL architecture as a shared service for the entire firm globally, using Informatica.
  • Designed the scalable ETL Architecture using Virtual IP and Global Dispatcher instead of Grid.
  • Set standards and best practices for the ETL developers.
  • Led a team of developers in tuning mappings and jobs to achieve best performance.
  • Hands-on experience installing the Repository Server and PowerCenter servers in both Linux and Windows environments.
  • Was involved in the development of mappings with all the transformations.
  • Analyzed sources and developed re-usable mapplets in Informatica Mapplet Designer.
  • Extracted Data from various sources including Oracle, Flat files and loaded data into the Data warehousing database using Informatica.
  • Worked as administrator for the ETL farm and maintained a stable environment.
  • Received best implementation award from Informatica.
  • Provided generic scripts for the ETL developers to run their jobs.
  • Designed the centralized EAI (Tibco BusinessWorks) / EII (Composite) and metadata management shared service architecture for the entire firm globally.
  • Set standards and best practices for the EAI/EII developers.
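
Generic run scripts like those provided to the ETL developers usually wrap whatever command starts the job with uniform logging and exit-code handling, so every team launches and monitors jobs the same way. A minimal sketch follows; the log directory and job names are hypothetical, and `echo`/`false` stand in for real job commands such as a `pmcmd` workflow start:

```shell
#!/bin/sh
# Illustrative generic job wrapper: uniform logging and exit-code handling
# around whatever command actually starts the ETL job.
set -u

LOGDIR=/tmp/etl_wrapper_logs
mkdir -p "$LOGDIR"

run_job() {
  job=$1; shift
  log="$LOGDIR/$job.log"
  echo "$(date '+%Y-%m-%d %H:%M:%S') START $job" >> "$log"
  if "$@" >> "$log" 2>&1; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') SUCCESS $job" >> "$log"
    return 0
  else
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') FAILED $job rc=$rc" >> "$log"
    return "$rc"
  fi
}

run_job daily_load echo "starting daily load"
run_job bad_job    false || echo "bad_job failed as expected"
```

The wrapper propagates the job's own exit code, so callers and schedulers such as Autosys can react to failures consistently while every run leaves a timestamped audit trail in its log.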

Environment: Linux, Windows NT, Oracle 9i, SQL Plus, PL/SQL, DB2 8.2, Sybase, Informatica PowerCenter 7.1.3, shell scripts, Hummingbird, SQL Loader, Tibco BusinessWorks 5.2 (EAI), JMS/Tibco Messaging, Composite 3.2 (EII), MetaIntegration for metadata management

Confidential

Data Integration Developer

  • Built a 7-terabyte data warehouse consolidating data from 8 different clients, used to analyze the credit background of direct marketing companies' customers and target potential customers with promotional offers. Used Trillium for data cleansing, standardization, and matching of names and addresses.
  • Implemented data loading using Informatica (ETL), PL/SQL, stored procedures, triggers, SQL Loader, and Unix shell scripting; designed and implemented the ETL code necessary to integrate data from various systems and flat files into the staging area and load it into the target database.
  • Designed the Data warehouse using Kimball's Star Schema methodology.
  • Analyzed sources, developed simple and re-usable transformation mappings in Informatica Designer.
  • Analyzed sources, developed simple and re-usable mapplets in Informatica Mapplet Designer.
  • Designed and developed mappings using Source Qualifiers, Expressions, Routers, Sorters, Aggregators, Normalizers, Filters, stored procedures, functions, and Update Strategies.
  • Performed server-side activities, including creating database client connections and designing Informatica batches and sessions to load data into target tables.
  • Extracted data from various sources, including COBOL files and flat files, and loaded it into the data warehouse database using Informatica.
  • Worked on Striva Detail (PowerConnect for Mainframe) to load EBCDIC files.
  • Performed Unit Tests.
  • Extensively worked with Trillium to standardize and match names and addresses and to cleanse data.
  • Worked with SAS 9 to generate reports and perform data analysis.

Environment: Unix, Windows NT, ORACLE 9i, SQL Plus, PL/SQL, DB2 8.1, Informatica - Powercenter 6.2, TOAD, Unix Shell scripts, Hummingbird, SQL Loader, ERWIN, Trillium 6 for Data Quality Name and Address Standardization, Detail, SAS 9

Confidential

PowerBuilder Lead Developer

  • The scope of this system is to provide users with screens to maintain the enrollment of students from different dealers in training offered by departments such as Technical, Sales, and Service. The system also includes a billing module to bill dealers for the training they received at Mercedes, can fax documents directly to dealers using RightFax, and generates various technical training confirmation reports using the RTE DataWindow.
  • Involved in the full life-cycle development of the project. Extensively wrote stored procedures for Sybase and optimized queries to improve performance. Involved in business process analysis, program specifications, development, testing, and subsequent implementation. Worked with the PowerBuilder Foundation Class Library, non-visual user objects, DataStore, and Rich Text Edit controls in PowerBuilder. Extensively used object inheritance and other object-oriented programming concepts.

ENVIRONMENT: Windows 2000, Sun Solaris, Sybase SQL Server 11.5.1, PowerBuilder 8.0, 7.0.2, SQL, Unix

Campaign Cost Analysis - Client: International Master Publishers, CT

The scope of this system is to provide users with tools to create a campaign mail piece and its associated cost data within the Marketing Information System (MIS) application. The system allows the Marketing department to create campaign drops and the Operations department to price the package components that make up each campaign drop's mail piece. Once the mail piece is priced, the campaign's matrix can be viewed and approved along with each mail piece component. Campaign costing is an Operations function; the Marketing group can only create campaign drops. Operations and Marketing put together the concept requirements for each campaign drop, and Operations assigns the package components to those concepts, thus creating the mail piece.

Involved in writing stored procedures, packages, and triggers in Oracle, and in optimizing queries to improve performance. Involved in business process analysis, program specifications, development using PowerBuilder, and subsequent implementation of executable packages, test plans, and testing of the system. Extensively involved in requirements gathering, solution design, and preparing documentation for the system. Worked with the PowerBuilder Foundation Class Library, non-visual user objects, and DataStore. Extensively used object inheritance and other object-oriented programming concepts.

ENVIRONMENT: Windows NT 4.0, HP - UX 10.0.2, Oracle 8i, PL/SQL, SQL Plus, SQL Loader, PowerBuilder 6.5, PVCS
