Data Architect Resume Profile
NJ
SUMMARY:
- Innovative IT professional with over 15 years of progressive experience focused on the analysis, design, development, customization, and maintenance of business applications, delivering leading-edge software solutions.
- Project management experience leading mid-size to large projects, planning and executing to deadline and budget under both Waterfall and Agile methodologies, including coordinating team members and managing costs to deliver according to plan.
- Strong project management experience defining project timelines, milestones, budgets, resource requirements, scope, goals, and deliverables.
- Direct experience in implementing enterprise data management processes, procedures, and decision support.
- Hands-on experience with data architecting, data mining, large-scale data modeling, and business requirements gathering/analysis.
- Defines data/information architecture standards, policies and procedures for the organization, structure, attributes and nomenclature of data elements, and applies accepted data content standards to technology projects.
- Experience and exposure to various business domains like Manufacturing, Services, Sales, Marketing, Banking, Finance, Retail, Supply Chain and Insurance.
- Experience providing direction and subject matter expertise in the establishment of strategic and tactical data governance, meta data management and data quality programs.
- Experience in roles where data analysis, assessment, profiling, transformation, reconciliation, improvement, monitoring, and presentation drove increased client satisfaction scores and higher adoption of products and services.
- Solid understanding of data governance theories, principles, processes, practices, and tools, including master data, data quality, data modeling, and data stewardship.
- Designed and developed BI reports in OBIEE 11g, SAP Business Objects XI 3 (Desktop Intelligence, Web Intelligence), Crystal Reports XI R2, and IBM Cognos 10.1 (Transformer, Framework Manager).
- Executed data warehousing/OLTP database projects through the full software life cycle, from requirements gathering, data profiling, data modeling, and technical specifications to design, development, and implementation.
- Hands-on ER and dimensional data modeling experience, designing OLTP databases and Data Vault, Star Schema, and Snowflake Schema models in Sybase PowerDesigner 15 and ERwin r7.3.
- Strong working experience in Informatica PowerCenter 9.1/8.6.x (Repository Manager, Server Manager, Workflow Manager, Mapping Designer, Transformation Designer, and Mapplet Designer) to extract, transform, and load data.
- In-depth exposure to and understanding of DWH concepts and the ETL process using Informatica.
- Worked on transformations such as Stored Procedure, Source Qualifier, Expression, Filter, Lookup, Java, Update Strategy, Joiner, Router, and Aggregator.
- Made recommendations for ETL design and performance enhancements for large-volume data loads and processing.
- Designed ETL solutions in IBM InfoSphere DataStage 7 to load the data warehouse from various source systems, creating a business-centric view that supports the business decision process.
- Extensively worked with the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), MultiLoad (MLOAD), and Statistics to export and load data to/from flat files.
- Experience in Big Data and Hadoop Ecosystem tools like HDFS, Hive, Sqoop, HBase and Pig.
- Experienced in installing, configuring, and administering Hadoop clusters on major distributions such as Cloudera.
- Hands-on experience writing MapReduce jobs in Java using Eclipse.
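The MapReduce pattern mentioned above can be sketched in plain Java. The following is an illustrative in-process simulation of the map, shuffle, and reduce phases for word counting; an actual Hadoop job would put this logic in classes extending `org.apache.hadoop.mapreduce.Mapper` and `Reducer`, and this sketch is not code from any of the projects below.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal in-process simulation of the MapReduce word-count pattern.
// A real Hadoop job would distribute the map and reduce phases across
// the cluster; here both run in one JVM for illustration only.
public class WordCountSketch {

    // Map phase: emit (word, 1) pairs for each token in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(Map.entry(token, 1));
            }
        }
        return pairs;
    }

    // Shuffle + reduce phase: group pairs by key and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new HashMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> emitted = new ArrayList<>();
        for (String line : new String[] {"to be or not to be"}) {
            emitted.addAll(map(line));
        }
        Map<String, Integer> counts = reduce(emitted);
        System.out.println(counts.get("to"));  // 2
        System.out.println(counts.get("be"));  // 2
    }
}
```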
SKILLS:
- O/S: AIX, Solaris, Linux, Windows, HP-UX 9.x/10.x, z/OS
- Languages: Shell Scripting, PL/SQL, C, C++, Java, JSP, SAS, JCL, J2EE, HTML, XML, Perl
- Databases: DB2 V9.7, Oracle 11g, MS SQL Server 11.0, Teradata 14.0, Hadoop
- Tools: Hadoop HDFS, MapReduce, Hive, Pig, Flume, Sqoop, DataStage 7/8, Informatica 9.1.0/8.6.x, Power Exchange, Informatica Developer, Informatica Analyst, Data Analyzer, Metadata Manager, Informatica MDM, DAC, OBIEE 11g, Business Objects XI R2/3/4, AutoSys, ER/Studio V8/V9, Sybase PowerDesigner 12.0/15.0, ERwin r7.3, DataStage Designer, DataStage Director, DataStage Manager, DataStage Administrator, Universe Designer, Web Intelligence, Cisco Tidal Enterprise Scheduler, ESP, Cognos.
- Other: Six Sigma, Dimensional Data Modeling, EIM, PL/SQL, Hub-and-Spoke, Bus Architecture, Bill Inmon and Ralph Kimball methodologies, ER Data Modeling, MDM, SOA, EAI, MVC
- Software: DOM, SAX, SQL LIMS, LabWare LIMS, Visio 2003, Siebel DWH, Quality Center, NDM/Connect Direct, WebLogic, Java, Teradata V2R5/V2R6, crontab, Korn shell, XML, XSD, ICDs, Data Flow Diagrams, DMAIC, Web Services, SOA, Unix, Linux, Data Cleansing, Clarity, SharePoint, Harvest, J2EE, FLOAD, MLOAD, TPUMP, TOGAF, Zachman, RUP, Semantic Web, OMG, JSP, Ajax, Spring 3.0, Struts, Hibernate 3.0, mobile.
PROFESSIONAL EXPERIENCE:
Confidential
ETL Architect / IT Lead
- Provided architectural and design leadership across several cross-functional development teams, and worked closely with other architects and project management.
- Responsible for managing Core Banking Hub resources and application delivery, driving process efficiency, and implementing systems and key initiatives.
- Oversaw activities for the Core Banking Hub project from planning through delivery and sustaining support to ensure that the project met established business objectives, time, and budget constraints.
- Coordinated with each interface team to review the interface architecture and task list, ensuring deliverables were produced in a timely manner.
- Provided leadership and coordination of all technical aspects of systems development activities required to provide a productive and efficient system that meets business requirements.
- Responsible for defining and implementing strategy for Trade finance data integration architecture, business intelligence, reference data, and data quality solutions for T360 data.
- Offered strong combination of business and technical expertise, ensuring steadfast project management from development of initial concept through design and implementation of ETL solutions.
- Assessed existing platform architecture frameworks and evaluated and selected software and hardware technologies.
- Provided rationalization of the ODS environment through business and technical analysis to efficiently organize, optimize, cleanse, and structure data to meet internal and external business needs.
- Worked with Business Analysts to define conceptual and logical models of Core Banking Hub database design.
- Led the Informatica development effort to build ETL processes loading trade finance data from the T360 system into the Core Banking Hub database.
- Designed the ETL architecture solution for the Core Banking Hub Operational Data Store requirements.
- Analyzed source data to determine accuracy and completeness, clarifying the structure, relationships, content, and derivation rules of the data.
- Designed the Core Banking Hub database following the Star Schema methodology for building the data warehouse.
- Designed the architecture of downstream loads feeding various downstream systems from the Core Banking Hub.
- Used architecture patterns to reuse the integration solution for specific data integration problems.
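The surrogate-key resolution at the heart of a Star Schema load can be sketched as follows. All table, key, and method names here are hypothetical illustrations, not the actual Core Banking Hub schema: incoming fact records carry natural business keys, which the load step resolves to dimension surrogate keys before the fact-table insert.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a star-schema load step: a fact record's natural
// business keys are resolved to dimension surrogate keys before insert.
// Dimension names, key formats, and the FactRow shape are illustrative only.
public class StarSchemaLoadSketch {

    // Surrogate-key lookups, one per dimension: natural key -> surrogate key.
    static final Map<String, Integer> customerDim = new HashMap<>();
    static final Map<String, Integer> productDim = new HashMap<>();
    static int nextCustomerKey = 1;
    static int nextProductKey = 1;

    // Get-or-create the surrogate key for a natural key (a late-arriving
    // dimension member gets a new key the first time it is seen).
    static int resolveCustomer(String naturalKey) {
        return customerDim.computeIfAbsent(naturalKey, k -> nextCustomerKey++);
    }

    static int resolveProduct(String naturalKey) {
        return productDim.computeIfAbsent(naturalKey, k -> nextProductKey++);
    }

    // A fact row after key resolution: ready for the fact-table insert.
    record FactRow(int customerKey, int productKey, double amount) {}

    static FactRow transform(String customerId, String productId, double amount) {
        return new FactRow(resolveCustomer(customerId),
                           resolveProduct(productId), amount);
    }

    public static void main(String[] args) {
        System.out.println(transform("C100", "P200", 125.50));
    }
}
```

In a production ETL tool the same lookup would typically be a cached Lookup transformation against the dimension table rather than an in-memory map.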
Environment: OVS, GPP, AML, HotScan, SWIFT Messaging, Doc Exam, Oracle GL, FAH, DB2, Mainframe, IMMS, ACBS, T360, Trade Finance, Loan, EXIM, Informatica 9.1, Oracle 11g, SAP Business Objects XI R2, Microsoft Project, Embarcadero ER/Studio 8.0, Putty, AIX Unix, Microsoft SharePoint, Tidal, Shell Scripts, Quest Toad, PL/SQL, Data Profiling, Power Exchange, Connect Direct, SFTP, MQ Broker, T360 Trade Processing System, Metadata Manager, Data Lineage, Audit Reports, Pushdown Optimization, Change Data Capture, XML, XSD, XML Generator, XML Midstream, XML Parser, MQ Middleware, DR testing.
Confidential
ETL Architect
- Worked with the Enterprise Architecture team to identify gaps between the Actimize system and overall enterprise architecture requirements.
- Led the technical architecture and design for a quality assessment of the new data warehouse.
- Led and mentored the data quality completeness initiative for the Informatica-based ETL architecture.
- Oversaw activities for all project tasks from planning through delivery and sustaining support to ensure that the project met established business objectives, time, and budget constraints.
- Performed a deep assessment of the Actimize system to determine whether ETL best practices were followed, and submitted recommendations for improvement.
- Designed architecture solutions for Hornet data warehouse requirements to develop the ETL solution.
- Initiated design of the ETL architecture solution from the existing ETL processes analyzed as part of the quality assessment.
- Reverse-engineered process flow diagrams from the ETL code in order to perform data quality measures.
- Evaluated the current data warehouse/ETL environment for usage in a spend analytics solution deployment and provided a gap assessment.
- Reverse-engineered the ETL database into physical data models using the ER/Studio tool.
- Produced source-to-target mappings based on review of the existing code and data analysis.
- Created data flow diagrams for ETL processes scheduled to run on a daily basis.
- Performed data quality assessment to identify hard-coded values, rejected records, and reference data.
- Developed data lineage reports in Metadata Manager and performed metadata profiling on Actimize data.
- Generated data profile reports using Data Explorer in Informatica 8.6.1 to understand the source and target data processed through the ETL processes for Actimize.
- Performance-tuned heavy queries and optimized Informatica PowerCenter workflows and stored procedures.
- Worked with developers to ensure designs and implementations conformed to internal standards, used best practices, and performed well.
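The column-level profiling described above can be sketched in Java. This is a minimal illustration of the kinds of metrics a tool like Informatica Data Explorer computes per column (row, null, and distinct-value counts), not the actual assessment code; the record and method names are hypothetical.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Minimal sketch of column-level data profiling: the kinds of metrics
// (null counts, distinct values) used to spot data quality issues such
// as missing values, hard-coded defaults, or unexpected cardinality.
public class ColumnProfileSketch {

    record Profile(int rowCount, int nullCount, int distinctCount) {}

    static Profile profile(List<String> columnValues) {
        int nulls = 0;
        Set<String> distinct = new HashSet<>();
        for (String value : columnValues) {
            if (value == null || value.isBlank()) {
                nulls++;              // count missing/blank values
            } else {
                distinct.add(value);  // track distinct non-null values
            }
        }
        return new Profile(columnValues.size(), nulls, distinct.size());
    }

    public static void main(String[] args) {
        List<String> status = Arrays.asList("OPEN", "CLOSED", null, "OPEN", "");
        System.out.println(profile(status));
        // Profile[rowCount=5, nullCount=2, distinctCount=2]
    }
}
```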
Environment: Informatica 9.1, Informatica 8.6.1, Oracle 11g/10g, SAP Business Objects XI R2, Microsoft Project, Embarcadero ER/Studio 8.0, Putty, Linux, MS SQL Server 2008 R2, AutoSys job scheduling, Informatica 7.1, Metadata Manager, Data Explorer 8.6, Data Quality 8.6, Shell Scripts, Quest Toad, PL/SQL, Mantas, Actimize, CadBatch, Hornet, CBW, Watch List, Restricted List, Erika Engine, Alerts, Java, Data Profiling, Power Exchange.
Confidential
Data Architect
- Designed architecture solutions for Investment BI reporting data requirements using data warehousing concepts.
- Designed the Asset Class Business Unit hierarchical data model using the Erwin 7.1 tool.
- Performed logical and physical data modeling for derivative elements to be sourced from the MUREX data system.
- Performed FileNet and ETL integration data modeling to capture report transfer activities.
- Designed the FileNet/ETL integration architecture to FTP OBI reports from OBI to FileNet through an ETL process.
- Designed and implemented MQ FTE AutoSys jobs for the FileNet/ETL integration process.
- Designed DataStage parallel jobs to generate and parse XML documents in order to share metadata between FileNet and ETL processes.
- Set up the OBI reporting system to schedule reports through iBots, delivering them in PDF format on the DataStage server so they could be transferred via MQ FTE to FileNet for further processing.
Environment: Architectural Patterns (Publish/Subscribe), RESTful API, BizTalk, MSMQ, NServiceBus, IBM DataStage 7, Oracle 11g, Erwin 7.1, OBIEE 11, Microsoft Project, Clarity, Putty, Linux, Microsoft SharePoint, Oracle SQL Developer, AutoSys job scheduling, Mobius, MQ FTE, StarTeam, FileNet, NAS storage mount.
Confidential
Data Architect
- Designed complex data integration solutions in an inclusive and participatory manner, negotiating with and influencing other design parties to reconcile technical and business considerations and arrive at the optimal solution to the requirements.
- Performed logical and physical data modeling for SQL LIMS and LabWare systems data requirements.
- Managed the complexity of the data integration environment in both technical and business terms, understanding data and process flow through the data integration layer and upstream/downstream system dependencies.
- Produced source-to-target mappings with business rules.
- Performed data analysis and profiling on LIMS and SAP data to understand relationships and data quality issues.
- Defined and implemented the DWH ETL architecture using the Kimball methodology for data requirements.
- Designed BO universes using PGS DWH dimensions and facts for reporting requirements.
- Translated business needs into technical solutions; designed, developed, and documented the overall system architecture; and led architecture solutions meeting performance, usability, scalability, reliability, and security needs.
Environment: Data Fabrication, Data Virtualization, Informatica 9.0.1, Web Services, HA grid, Oracle 10g, Oracle E-Business Suite (EBS) 12.1, Erwin 7.1, SAP, OLAP technologies, Business Objects XI 3.0, Microsoft Project, SQL LIMS, LabWare LIMS, SharePoint, Universe Designer, Web Intelligence, Oracle Fusion Applications, Ralph Kimball Methodology.
Confidential
ETL Architect
- Performed logical and physical dimensional modeling for the PNCI DWH using the Erwin Data Modeler 7.3 tool to create the data warehouse table structures needed for ETL implementation.
- Analyzed data from a variety of data sources to present a cohesive view, and completed data profiling using IDE on various source systems to understand the data.
- Led data design and ETL efforts to build the data warehouse, including writing design specifications and finalizing the DWH ETL architecture.
- Worked closely with the Informatica development team on ETL design and build to ensure the data solution met business expectations and technical difficulties were resolved on time.
- Designed a three-layer ETL architecture for the PNCI data warehouse, where the middle (unified) layer stores operational data for PNCI reporting and downstream applications.
- Established new extract feeds using the PowerExchange tool to integrate source data with Informatica processing for DWH loads.
- Created, reviewed, and provided recommendations on technical procedures, architecture documentation, and engineering decisions.
Environment: Informatica 8.6.x, Power Exchange, Oracle 10g, Erwin 7.3, Shell Script, AIX, Mainframe, OLAP technologies, data migration strategies, Trillium version 12, Harvest Software Change Manager, CA Clarity, OBIEE 11g, NDM/Connect Direct, Oracle E-Business Suite (EBS) 12.1, Oracle Fusion Applications, MS Project.
Confidential
ETL/Data Architect / Project Manager
- Responsible for the overall design of the data/information architecture, which maps to the enterprise architecture and balances the need for access against security and performance requirements.
- Designed enterprise-wide logical and physical data models using Sybase PowerDesigner to put database structures in place for Orders, Products, and Dealer Inventory Tracking data, enabling the start of web interface and ETL implementation.
- Managed the program/project roadmap and key milestones, assessing criticality and downstream impact if dates were missed and determining alternative/mitigating actions.
- Developed cross-functional project plans and managed execution of tasks and completion of all deliverables within the business process and technical areas.
- Developed data/metadata models and technical specifications using standard modeling techniques.
- Analyzed the business requirements to design, architect, develop and implement highly efficient, highly scalable ETL processes for Inventory data.
- Analyzed data from a variety of data sources to present a cohesive view of the data and completed data profiling using IDE on various source systems to understand the data.
- Worked with business and technical partners to perform data analysis and data profiling, identify data quality issues, develop data models, ETL, and report specifications, and document enterprise data repositories and repository relationships.
- Prepared, maintained, and published ETL documentation, including source-to-target mappings and business-driven transformation rules.
- Designed and developed the Operational Data Store for EIM DIT report requirements that will help business users in business decision process.
- Implemented data quality solutions, including standardization and matching of source data using IDQ.
- Actively designed the PIN Expansion data model to achieve interoperability between systems utilizing different PIN formats.
- Architected, designed, and developed enterprise data warehouse components, including databases and ETL, using Informatica.
- Designed and implemented a product data vault to extract, transform, and load product data with a history of changes to product attributes.
- Supported project teams during user acceptance testing activities and assisted in developing the post-deployment data maintenance plan.
- Developed the data integration specifications for ETL and web, including data cleansing and data transformation rules.
- Defined data/information architecture standards, policies, and procedures for the organization, structure, attributes, and nomenclature of data elements, and applied accepted data content standards to technology projects.
- Identified data quality issues by analyzing data in various systems and working with business users in order to implement data cleansing processes.
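The data-vault-style history tracking of product attributes described above can be sketched as follows. The satellite structure and all names here are hypothetical illustrations, not the actual product data vault: each change to an attribute appends a new satellite row with a load date instead of overwriting the previous value.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of data-vault-style history tracking: attribute
// changes are appended as new satellite rows with a load date, so the
// full change history of each product attribute is preserved.
public class ProductVaultSketch {

    record SatelliteRow(String productKey, String attribute,
                        String value, LocalDate loadDate) {}

    static final List<SatelliteRow> satellite = new ArrayList<>();

    // Append a row only when the attribute value actually changed.
    static void load(String productKey, String attribute,
                     String value, LocalDate loadDate) {
        SatelliteRow latest = null;
        for (SatelliteRow row : satellite) {
            if (row.productKey().equals(productKey)
                    && row.attribute().equals(attribute)) {
                latest = row;  // rows are appended in load order
            }
        }
        if (latest == null || !latest.value().equals(value)) {
            satellite.add(new SatelliteRow(productKey, attribute, value, loadDate));
        }
    }

    // Number of historical versions recorded for one attribute.
    static int historyDepth(String productKey, String attribute) {
        int n = 0;
        for (SatelliteRow row : satellite) {
            if (row.productKey().equals(productKey)
                    && row.attribute().equals(attribute)) {
                n++;
            }
        }
        return n;
    }

    public static void main(String[] args) {
        load("DEMO", "price", "19.99", LocalDate.of(2021, 1, 1));
        load("DEMO", "price", "21.99", LocalDate.of(2021, 2, 1));
        System.out.println(historyDepth("DEMO", "price"));  // 2
    }
}
```

In a real vault the satellite would live in the database keyed by the hub's hash key, and the "changed?" check would compare row hash-diffs rather than single values.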
Environment: Integration methods (ESB, Web Service, API, ODBC, MOM), Informatica 7.1, IDE, IDQ, DB2 Host, AIX, Shell Script, QMF for Windows 8.1, WinSQL 3.5, Microsoft Visio 2003, ESP, JCL, SAS, Oracle 9i, Business Objects, Universe Designer, Web Intelligence, database performance tuning, data migration strategies, SQL Server 2005, SSIS, Trillium version 10, XML, XSLT, XPath and XQuery, DOM and SAX, ClearCase, SharePoint, Quality Center, SQL Server 2008, Bill Inmon Methodology.
Confidential
ETL Lead / Assistant Project Manager
- Analyzed database issues and provided recommendations as required to support the development and system administration teams in the design, development, testing, and tuning of the supporting applications, queries, and relational database.
- Worked with business users to document Data Migration requirements and criteria, define Master Data and functionality requirements, including data acquisition, quality, approvals and distribution.
- Architected, designed, and developed the operational databases required for BI requirements.
- Designed and documented ETL processes using various stages, including the Aggregator, Transformer, Sort, Complex Flat File, and DB2 stages, for DMT organization data.
- Provided users with the best ETL solutions for the project, following DataStage development best practices.
- Performed DataStage ETL design and code reviews, ensuring smooth development and deployment.
- Documented defects and coordinated fixes with the offshore technical lead.
- Worked as data modeler to ensure the right database was in place before development began.
- Developed and maintained a data dictionary reflecting the current database design.
- Worked with business, functional, and technical teams to coordinate deployment activities, constantly monitoring actuals vs. plan, highlighting anticipated issues or slippage, and formulating recovery plans or assessing impact if not recoverable.
- Architected the new security model per SOX regulations and managed the project/delivery.
- Facilitated broad adoption of consistent security measures, providing high-level information assurance, availability, and essential security to maintain a reliable system and consumer confidence.
- Worked directly with functional SMEs, BSAs, and the development team to troubleshoot issues, resolve defects, and implement system enhancements.
- Accountable for the program BI/BW solution, including Oracle BI EE and Business Explorer (BEx) design, integration, development, and oversight; identified, reported, and ensured mitigation of risks and issues.
Environment: Siebel Analytics 7.8.4/7.5, Trillium version 10, Siebel DWH, Informatica 7.1, Oracle 9i, Tomcat Web Server, HP-UX, DAC Server, Shell Script, MS Project, Ascential DataStage 7.5, ERwin r7.1.
Confidential
Tech Lead
Environment: Teradata V2R5/V2R6, Oracle 9i, Informatica PowerCenter/PowerMart 7.x/8.0, Maestro, Business Objects, Erwin 4.1.4, PL/SQL, SQL*Loader, Toad, UNIX, Solaris, Windows NT, Putty.
Confidential
ETL Designer
Environment: Cognos Impromptu 6.0, PowerPlay 6.6, PPES 6.6, Informatica PowerCenter v5.1, Erwin 3.5.2, Oracle 8i on Sun Solaris and Windows 2000.
Confidential
Lead Developer
Environment: EJB, Java, JDK, JavaScript, JSP, JDBC, Oracle 8i, WebLogic Server 5.0, Sun Solaris, JDeveloper 3.2.3, TeamSite 4.2.1, Rational Rose 2000, Toad, ERwin 3.5, PL/SQL.
