
Big Data Solutions Architect Resume


SUMMARY:

  • Implemented architectural solutions for large-scale Enterprise Information programs (MDM, BI, Data Integration, Analytics, and Enterprise Integration).
  • Established best practices through published papers and on-site consultation.
  • Field experience with key Data Quality solutions: entity de-duplication, data cleansing, and data profiling.
  • Consulted on Information Architecture solutions in the Banking, Insurance, Retail, Government, Utilities, Media, and Telecommunications industries.
  • Developed and led performance proof-of-concepts and benchmarks for client pre-sales scenarios across multiple product lines.
  • Published papers and created curriculum for performance best practices.
  • Collaborated on and evaluated critical new product designs and strategic product integrations.
  • Developed prototype project plans, service cost models, etc.
  • Bridged implementation experience with pre-sales to accelerate the closing of sales.
  • Multicultural experience gained through international work experience.

TECHNICAL SKILLS:

Big Data: Hadoop Distributed File System (HDFS), MapReduce, Hive, Pig, Sqoop, HBase, Flume, Oozie, Mahout. Hands-on experience with installation, configuration, and administration of the Hadoop distributions from Apache, Cloudera, IBM, and Hortonworks.

ETL, Data Quality & Analysis: Informatica PowerCenter, Ascential DataStage, Ascential QualityStage, ProfileStage, IBM Information Server, IBM Information Analyzer, Talend. Hands-on experience with installation, configuration, and administration of IBM InfoSphere products.

Database & Data Modeling: Oracle, Teradata, DB2 UDB, DB2/400, IMS, Sybase, MS SQL Server, Informix, SQLite, Erwin, and IBM Rational.

BI Reporting: Cognos, Business Objects, MicroStrategy, M-Text, Crystal Reports.

Mobile: Android SDK, iOS SDK, cross-platform development tools.

O/S: AIX, Sun Solaris, HP-UX, Linux, Windows, MVS/z/OS, AS/400, Android, iOS.

Cloud computing: Amazon EC2, HP Cloud

Scheduling: Control-M, Autosys, ESP

PROFESSIONAL EXPERIENCE:

Confidential

Environment: CDH4, Linux, Unix, Hadoop Distributed File System (HDFS), MapReduce, Hive, Pig, Sqoop, HBase, Flume, Splunk, Oozie, Mahout, Talend, Teradata, Oracle, IBM Information Server 8.5/DataStage, Autosys.

Big Data Solutions Architect

Responsibilities:

  • Performs a key leadership role in the areas of Big Data implementation, data integration, and database design.
  • Installed and configured the Big Data platform and Talend, and integrated Hadoop with Teradata (see the sketch after this list).
  • Defined the data architecture and integration methodology (EAI, ETL, and ELT) and provides input to the strategy roadmap for enterprise data modeling, implementation, and data management for Big Data systems. This includes setting the vision, gathering requirements, gaining business consensus, performing vendor and product evaluations, mentoring business and development resources, delivering solutions, and producing documentation.
  • Provides standards and guidelines for the design, development, tuning, deployment, and maintenance of Big Data models and physical databases.
  • Designs and creates Big Data architectures and collaborates with peers to translate the solution into logical and physical data models.
  • Designed and defined an ETL framework (data quality, audit, and data functionality) on the Big Data platform.
  • Provides an overview of logical and physical data model development across application teams.
  • Assists in the development of policies around data access, data retention, data security and data stewardship.
  • Actively participates in enterprise-level information technology roadmap planning and information technology governance activities.
  • Participates in product roadmap discussions, identifies key areas for improvement in the product, and incorporates these goals into ongoing and future development initiatives.
  • Explores and considers future and non-standard database technologies with the desire to push the edge on the technology stack.
  • Leads innovation by exploring, investigating, recommending, benchmarking and implementing Big Data technologies
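
A minimal sketch of the Hadoop-Teradata integration described above, assuming Sqoop with the Teradata JDBC driver; the host, database, credentials, and table names are hypothetical placeholders, not the client's systems:

    # hypothetical wrapper that imports a Teradata table into HDFS via Sqoop;
    # all connection details are placeholders
    import subprocess

    def import_table(table: str, target_dir: str) -> None:
        subprocess.run([
            "sqoop", "import",
            "--connect", "jdbc:teradata://td-host/DATABASE=sales",  # placeholder
            "--driver", "com.teradata.jdbc.TeraDriver",
            "--username", "etl_user",                 # placeholder credentials
            "--password-file", "/user/etl/.td_pass",
            "--table", table,
            "--target-dir", target_dir,
            "--num-mappers", "4",
        ], check=True)                                # raise if the job fails

    import_table("ORDERS", "/data/raw/orders")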

Confidential

Environment: Teradata, Oracle, Sybase, Informatica 8.6, IBM Information Server 8.5/DataStage, MVS, ETI, Metadata, ESP, Unix, AMJ.

Solution Architect

Responsibilities:

  • Solution Design, Database Design, Development, Impact Analysis and Testing.
  • Created the High-Level Design Document (HLDD) and Low-Level Design Document (LLDD), showing the architecture diagram with pointers to detailed feature specifications for smaller pieces of the design.
  • Defined the data architecture and integration methodology (EAI, ETL, and ELT).
  • Developed and maintained a Review & Quality checklist document to ensure that submitted design documentation complied with the agreed process for review by Westpac’s architectural and technical teams.
  • Translated architectural documents and worked with the Technology and Infrastructure (T&I) team to deliver the system test, system integration test, end-to-end test, and UAT environments for functional and non-functional capabilities.

Confidential

Environment: Oracle 10g, XML, Java, Web Services, IBM InfoSphere Information Server 8.1/DataStage.

Solution Architect

Responsibilities:

  • Developed and maintained integration technical standards, design patterns, architectures, and the roadmap for AAA data integration.
  • Defined the SOA solution (for data consumption) and interface standards between the DWH and TripleA.
  • Designed, defined, and implemented an ETL framework for extracting data from Macquarie's different source systems, loading it into the DWH, and performing real-time/batch integration with the adviser desktop system TripleA (see the sketch after this list).
  • Worked as an SME (subject matter expert), providing technical advice to the project team (designers, developers, and testers) and delivering the solution as defined in the agreed architecture.
  • Collaborated with multiple Business Technology teams and architects to provide integration solutions that meet business capabilities and are consistent with enterprise standards, architectures, and strategies.
  • Guided the implementation and testing (SIT, system integration; UAT, user acceptance) work streams in delivering the solution against the agreed business requirements.
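
The actual framework was built on IBM InfoSphere DataStage; the Python sketch below only illustrates the batch vs. real-time routing idea, with hypothetical record and publisher names:

    # hypothetical illustration of routing extracted records either to a
    # real-time feed or a nightly batch file; all names are placeholders
    import json
    from typing import Iterable

    def publish_realtime(record: dict) -> None:
        # stand-in for a message or web-service call to the adviser desktop
        print("push ->", json.dumps(record))

    def integrate(records: Iterable[dict], mode: str) -> None:
        if mode == "realtime":
            for rec in records:
                publish_realtime(rec)              # one record at a time
        else:
            with open("triplea_batch.dat", "w") as out:
                for rec in records:                # accumulate for the batch window
                    out.write(json.dumps(rec) + "\n")

    integrate([{"account": "A1", "balance": 100.0}], mode="realtime")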

Confidential

Environment: Informatica Power Center, Java, JavaScript, Erwin, MVS, TSO, ISPF, Oracle, SQL Server, DB2/UDB.

DWH & BI Solution Designer

Responsibilities:

  • Providing the architecture for a flexible auditing and tracking system that can easily be hooked onto any existing risk system (see the sketch after this list).
  • Designing an ETL solution using ETL tool Informatica Power Center.
  • Translating business requirements into technical design.
  • Providing performance improvement recommendations.
  • Creating blueprints and proof-of-concept jobs and templates.
  • Providing consultation to ANZ development team in India.
  • Providing a common DB design for the auditing and tracking system.
  • Providing an automated solution for performance improvement.
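
A minimal sketch of the "hook onto any existing system" idea, shown as a hypothetical Python audit decorator writing to a SQLite table; the real design used a common DB schema with Informatica PowerCenter, and all names here are placeholders:

    # hypothetical audit hook: wraps any existing load step and records
    # start/end time and row count without changing the step itself
    import functools, sqlite3, time

    db = sqlite3.connect("audit.db")
    db.execute("CREATE TABLE IF NOT EXISTS audit "
               "(step TEXT, started REAL, ended REAL, rows INTEGER)")

    def audited(step_name: str):
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                start = time.time()
                rows = fn(*args, **kwargs)         # existing step returns a row count
                db.execute("INSERT INTO audit VALUES (?, ?, ?, ?)",
                           (step_name, start, time.time(), rows))
                db.commit()
                return rows
            return inner
        return wrap

    @audited("load_risk_positions")
    def load_risk_positions():
        return 42                                  # placeholder row count

    load_risk_positions()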

Confidential

Environment: HP-UX 8, Solaris 10 Grid/Clustered, Oracle 10g, SQL Server 2003, DataStage, QualityStage, ProfileStage, and Erwin.

Solution Architect

Responsibilities:

  • Installed, configured, and administered the DataStage, QualityStage, ProfileStage, and Siebel environments.
  • Produced the capacity planning document and provided the hardware and software architecture.
  • Created the blueprint and solution document for Origin's data migration and data quality approach.
  • Provided guidance to the project team for financial data migration and the resolution of data quality issues.
  • Provided guidance to the project team to ensure the migration process met the SLA migration batch window.
  • Integrated data from different billing systems using the DataStage ETL tool.
  • Created high-level ETL design documentation.
  • Participated in and provided input for the MDM implementation.
  • Translated business requirements into technical design.
  • Integrated the Siebel EIM process with DataStage and created a utility to initiate the Siebel EIM process from DataStage (see the sketch after this list).
  • Created blueprints and proof-of-concept jobs and templates for the Siebel integration.
  • Extracted data from OS/390 files, Oracle, and SQL Server into Siebel using DataStage, shell, and Perl scripts.
  • Using DataStage, created various transformations such as Joiner, Aggregate, Funnel, Change Capture, Filter, and Update Strategy.
  • Created a complex transformation utility using a custom stage.
  • Provided the architecture to integrate Siebel with non-Siebel systems and transfer data on a daily basis.
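
A minimal sketch of the kind of utility that can kick off a Siebel EIM task from an ETL job, assuming invocation of Siebel's srvrmgr command line; the gateway, enterprise, server, credentials, and .ifb configuration file are all hypothetical placeholders:

    # hypothetical trigger for a Siebel EIM task via srvrmgr
    import subprocess

    def run_eim_task(ifb_file: str) -> None:
        task_cmd = f'start task for comp EIM with Config="{ifb_file}"'
        subprocess.run(
            ["srvrmgr",
             "-g", "gateway_host",     # placeholder gateway name server
             "-e", "siebel_ent",       # placeholder enterprise
             "-s", "siebel_srvr",      # placeholder Siebel server
             "-u", "sadmin",
             "-p", "password",
             "-c", task_cmd],
            check=True,                # fail loudly if srvrmgr returns non-zero
        )

    run_eim_task("import_accounts.ifb")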

Confidential

Environment: Erwin, Teradata, SAS and Oracle.

Data Architect / Metadata Architect

Responsibilities:

  • Created Conceptual Data Model (CDM) based on the Project Charter and Business Capabilities Documents.
  • Produced Logical Data Model (LDM) based on business requirements documents.
  • Participated and provided input for MDM implementation.
  • Facilitated workshop sessions.
  • Participated in and/or led data gathering and design sessions to identify the data requirements, and maintained a log of open and closed issues.
  • Created Data Mapping Document, based on the business requirements.
  • Conducted assessments, produced designs, and delivered implementations for data protection, archiving, and data storage solutions.
  • Collaborated with peers, business analysts, system analysts, developers, project managers, and lead logical and physical data architects during the design and analysis phases of the projects.
  • Participated in and/or led inspections (internal and external), data analysis, and data discovery sessions.
  • Worked with physical Data Architecture staff to transform logical data models into physical database designs.
  • Published the models and other documentation.
  • Produced a series of Erwin reports and created a data dictionary.
  • Reviewed and validated the developed blueprints and artifacts, their foundations, and their implications with the stakeholders.

Confidential

Environment: Red Hat and SUSE Linux, Solaris Grid/Clustered, Oracle 10g, DB2, Teradata, SAS, DataStage Enterprise, Erwin.

ETL Solution Architect

Responsibilities:

  • Requirements gathering.
  • Translated business requirements into technical design.
  • Designed and maintained logical and physical enterprise data warehouse schemas using Erwin.
  • Extracted data from OS/390 files, Oracle, Teradata, SQL Server, and SAS using DataStage, Linux shell scripts, and SAS 9.1.
  • Using DataStage, created various transformations such as Joiner, Aggregate, Funnel, Filter, and Update Strategy.
  • Created migration and metadata documents for users.
  • Performed DataStage administration tasks and configuration.
  • Conducted parallel testing between DataStage jobs and the existing mainframe JCL jobs (see the sketch below).
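
A minimal sketch of the kind of reconciliation check used in parallel testing, comparing extract files produced by the new DataStage jobs against the legacy mainframe output; the file names, delimiter, and key column are hypothetical:

    # hypothetical parallel-test reconciliation: compare row counts and keys
    # between the legacy mainframe extract and the new DataStage output
    def load_keys(path: str, key_col: int = 0, sep: str = "|") -> set:
        with open(path) as f:
            return {line.rstrip("\n").split(sep)[key_col]
                    for line in f if line.strip()}

    legacy = load_keys("legacy_extract.dat")       # placeholder file names
    new = load_keys("datastage_extract.dat")

    print(f"legacy rows: {len(legacy)}, new rows: {len(new)}")
    print("missing from new:", sorted(legacy - new)[:10])
    print("unexpected in new:", sorted(new - legacy)[:10])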

Confidential

Environment: Win 2K Server, Oracle 9i, DB2, SQL Server 2000, Informatica PowerCenter, Business Objects, Erwin.

Senior Data Warehouse Consultant and Data Architect

Responsibilities:

  • Requirements gathering.
  • Translated business requirements into technical design.
  • Designed and maintained logical and physical enterprise data warehouse schemas using Erwin.
  • Extracted data from OS/390 VSAM files, Oracle, SQL Server, Sybase, and DB2 using Informatica PowerCenter and languages such as C, COBOL, PL/1, and Unix shell scripts.
  • Using Informatica PowerCenter, created various transformations such as Joiner, Aggregator, Filter, and Update Strategy.
  • Created migration and metadata documents for users.
  • Used Business Objects Designer, Reporter, and Supervisor.
  • Resolved cardinality and complex loop problems using context and alias methods.
  • Enhanced and maintained reporting applications.

Confidential

Environment: DB2 UDB, SAP (FICO), VSAM, Oracle 9i, Unix shell scripts, PL/SQL scripts, Erwin, DataStage, Cognos, DB2/400.

SAP Integration Consultant

Responsibilities:

  • Implemented SAP (FICO) for clients.
  • Gathered customization requirements.
  • Performed project planning and specification preparation.
  • Performed query optimization.
  • Documented system administration procedures and policies.
  • Evaluated hardware and software performance.
  • Integrated SAP (FICO) with other non-SAP systems.
  • Analyzed the sources and targets, transformed and mapped the data, and loaded it into the targets using DataStage.
  • Developed several cubes and created simple and complex reports using Cognos.

We'd love your feedback!