
Lead/Data Architect Resume


SUMMARY:

  • Highly motivated, confident Lead/Data Architect with exceptional multi-tasking, communication and organizational skills, possessing strong business acumen for client data needs.
  • Extensive experience identifying and delivering data-service needs for key external and internal business stakeholders. Significant record in data analysis, data profiling, data management, data integration, master data management, performance engineering and infrastructure design, with the ability to quickly understand the mission, vision and values of an organization.
  • As a Lead/Architect, managed overall strategy, architecture and implementation of various storage, database, data integration, data security, migration, replication and data services projects. Created and implemented IT business objectives, SLAs and KPIs. Managed day-to-day operations of database, data warehouse and infrastructure environments, as well as data governance.
  • Guided resources to design, maintain, develop and support highly available solutions with Oracle, Teradata, NoSQL/Big Data and Hadoop ecosystems.
  • Worked as a Technical Lead and Data/Database/DW Architect in the IT industry for more than 7 years, with over 11 years of experience on Oracle, Teradata and Big Data technologies across all phases of the life cycle: analysis, architecture, design, development, integration, security and implementation of various database technologies and the applications running on them.

TECHNICAL SKILLS:

Operating Systems: Sun Solaris, HP-UX, IBM AIX, RedHat Linux, Ubuntu, Windows and Mainframes

Databases: Oracle, Teradata, MySQL, DB2, MS SQL Server

Applications/Tools: Toad, SQL Navigator, Eclipse, Oracle Golden Gate, IBM Rational Clear Case, IBM Rational Clear Quest, Teradata utilities, Tortoise GIT, Jira 5.0.x, UC4, Metavante, Metavante Starview, SAS, CVS, VSS, Magic Draw UML

Data Warehousing: Ab Initio, DataStage, Informatica, Business Objects, Microstrategy, Crystal Reports, Oracle reports, UML

Big Data: Hadoop, HDFS, MapReduce, Pig, Hive, NoSQL, Cloudera distribution, Hortonworks, Cassandra, MongoDB, HBase, Sqoop, Flume, Oozie, ZooKeeper

Modeling: Visio, Erwin, SQL Developer

Scripting/Programming: Shell, SQL, HiveQL, Impala, PL/SQL, Bash, Ksh, Perl, Python, Core Java, C, C++, JCL, COBOL, Pro*C

PROFESSIONAL EXPERIENCE:

Confidential

Lead/Data Architect

Responsibilities:

  • Participated in monthly meetings with senior management to discuss technical infrastructure and market recommendations, assisting in data profiling and analysis, architecture design, and data storage and access.
  • Brought combined business and technical expertise, ensuring steadfast data management grounded in an understanding of enterprise-level data; provided guidance on overall data strategy using NoSQL as well as traditional relational approaches, depending on business needs.
  • Developed and maintained a formal description of the data and data structures which included data definitions, data modeling (logical and physical), data flow diagrams etc., and included topics such as metadata management, business semantics, and defining data layers for local and enterprise level reporting capabilities.
  • Worked with ETL development and support teams on tuning large load and extract processes to improve performance.
  • Designed and implemented ETL processes to load/unload data sets between NoSQL/Hadoop clusters and RDBMS/data warehouse stores.
  • Worked with database and system administrators improving service level agreements by reducing time to deliver IT solutions to business by better data governance.
  • Developed stronger ties between customers and Infrastructure team resulting in collaborating and working together to effectively use IT for business in terms of storage and access of data for better data governance.
  • Worked with Business analysts, Site leads and other IT managers in understanding the technical solution of the data to be migrated from legacy to the new systems making sure the functionality and the business processes remain intact after the data migration.
  • Developed overall system architecture diagrams used as the primary integration solution view to give a high-level understanding of the business segments involved.
  • Analyzed and prepared job dependency diagrams and the processes for enterprise systems which will help in the smooth transition of data into the production environments.
  • Analyzed technical solution documents and functional requirements documents and the overall architecture documents.
  • Involved in analyzing, designing and configuring Hadoop clusters. Worked with the customer data team on installation, configuration and administration of a 40-node HDFS cluster, including the Hadoop ecosystem (Pig, Hive, Sqoop, Flume, HBase), in a RedHat Linux environment.
  • Provided and architected storage and Linux systems for databases, HDFS and Big Data environments using Hadoop/HDFS, HBase, Pig, Hive, Map/Reduce and NoSQL (MongoDB & Cassandra).
  • Designed, built and delivered a large-scale Hadoop cluster. Fine-tuned the compute, network and storage layers in the HDFS cluster to achieve a high-performance, scale-out architecture and to host very large data sets.
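
The RDBMS-to-HDFS loads described above typically revolve around a Sqoop import wrapped in a reviewable shell script. The minimal sketch below is illustrative only: the JDBC connect string, table name and target directory are hypothetical placeholders, not details from the actual engagement.

```shell
# Sketch of a Sqoop-style RDBMS-to-HDFS load wrapper. All connection
# details below are invented placeholders.

build_sqoop_cmd() {
    # $1 = source table, $2 = HDFS target directory
    printf 'sqoop import --connect jdbc:oracle:thin:@//dbhost:1521/ORCL --table %s --target-dir %s --num-mappers 4' "$1" "$2"
}

run_ingest() {
    # DRY_RUN=1 (the default here) prints the command for review
    # instead of executing it against the cluster.
    cmd=$(build_sqoop_cmd "$1" "$2")
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "$cmd"
    else
        eval "$cmd"
    fi
}

# Example: preview the import for a hypothetical CUSTOMERS table.
DRY_RUN=1 run_ingest CUSTOMERS /data/raw/customers
```

The dry-run default keeps the wrapper safe to review before it ever touches the cluster; setting DRY_RUN=0 would execute the generated command.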

Confidential

Sr Database Consultant/Architect

Responsibilities:

  • Offered strong combination of business and technical expertise, ensuring steadfast project management from development of initial concept through design and implementation of ETL solutions.
  • Provided architectural and design leadership across several cross-functional development teams, and worked closely with other architects and project management.
  • Responsible for all phases of Project Management for accomplishing multiple deliveries providing environment through business and technical analysis to efficiently organize, optimize, cleanse and structure data to meet internal and external business needs.
  • Worked on requirement gathering/analysis with the client to bring information of different geographical regions under the single point of Data warehouse.
  • Analyzed the source data to determine accuracy and completeness of the source data which clarified the structure, relationship, content and derivation rules of data.
  • Worked with DBAs in preparing DDL/DML/DCL/TCL scripts for smooth transition of code to other environments and production.
  • Created new UC4 jobs and modified existing jobs to accommodate enhancements made in the processes making sure the overall job dependency stays intact.

Confidential

Sr Data/Database Architect

Responsibilities:

  • Worked on a Data Strategy enumerating the Data Policies, each of which commits the organization to codifying a best practice. The policies focus on the areas of data standards; data retention and data stewardship; and data security (information assurance).
  • Prepared Data dictionary and Master Data Management strategy that provide data governance, standards that consistently manage critical data of the enterprise classifying the data into different logical categories.
  • Managed both master data and analytical data, standardizing the data to ensure consistency and control in ongoing maintenance and predictive analysis.
  • Worked with DBAs providing guidance on normalizing the data, providing solutions on indexing which help in speedy access to data for business critical decisions.
  • Worked with business analysts to understand business requirements and prepare functional design documents, technical design briefs and, finally, source-to-target mapping documents for smooth transfer of data from source to target.
  • Worked with SMEs to understand the source data and address predominant issues in the source system, which helped in designing the solution.
  • Conducted review meetings to review the designed solution, improve the process, making sure the process is up to the pre-determined standards and guidelines for smooth migration of code into all the development, testing, production and disaster recovery environments.
  • Worked on complex queries, Packages, Procedures, Functions and Triggers for all functionalities of the application.
  • Performance tuned long running PL/SQL packages making use of collections or other Oracle performance tune techniques where necessary to implement business rules.
  • Worked with the developers, constantly guiding them to follow the specified coding standards and helping them tune various programs.
  • Configured and developed MapReduce programs, Hive UDFs in Java, and Hive and SQL queries for the analysts to meet business requirements.
  • Worked on a Hadoop cluster running MapReduce jobs, using Sqoop to ingest data from various source systems into HDFS.
  • Maintained System integrity of all sub-components related to Hadoop (primarily HDFS, MapReduce, HBase, and Hive).

Confidential

Senior Database/Application Consultant

Responsibilities:

  • Participated in meetings with the clients (end users), gathered business requirements, documented functional specs and helped establish deliverables, paying continuous attention to technical excellence and good design.
  • Created guidelines on naming conventions, standards and processes, educating the enterprise teams in following the set standard guidelines throughout the project development.
  • Participated in all phases of SDLC from requirements gathering, design, development, testing, implementation and support across systems.
  • Worked on finalizing Functional requirements in collaboration with client, design the system by transforming the Functional requirement into Technical Specifications using the set guidelines.
  • Followed agile methodology for quick delivery of working software, focusing on quick responses to change and continuous development.
  • Worked with co-developers and client managers for secured life cycle from development through testing and implementation to production.
  • Programmed functionality for all the components in the user interface interacting with the database using Enterprise JavaBeans, and developed various controller classes and business logic using the Spring libraries, which interact with the middle tier to perform the business operations.
  • Monitored and performance-tuned SQL and PL/SQL programs using Explain Plan and SQL Trace, partitioning tables where necessary and studying the data to create the right indexes, reducing runtime and increasing efficiency.
  • Developed the job dependency diagrams for batch processing successfully implementing the external system dependencies.
  • Worked on Java programs, Unix scripts and UC4 scheduling to run fine-tuned PL/SQL and SQL programs
  • Supported production and production parallel systems, actively resolving production issues by meeting the required SLAs.

Confidential

ETL Architect/Lead Consultant

Responsibilities:

  • Worked with end users to translate business-reporting requirements into data warehouse architectural designs, analyzing source and target data models, making necessary enhancements.
  • Worked with Site Leads in understanding the source data, preparing logical data model and physical data models to help understand the source data and target structures for good transition of data between systems.
  • Implemented an ETL solution for 34 different source systems that provided the feeds to a next-generation centralized enterprise data warehouse to create effective employee information.
  • Worked as the single point of contact for the client engagement in the domain of Ab Initio, Oracle and ETL issues for the team.
  • Worked as an expert designer and developer in Ab Initio and mentored the team on the development modules.
  • Oversaw activities for all project tasks from planning through delivery and sustaining support, ensuring that projects met established business objectives and time and budget constraints.
  • Worked as a Designer/Developer/SME for the ETL implementation for master customer data management.
  • Generated, tested and deployed Korn shell scripts that detect data files arriving continuously and populate the warehouse, thereby providing continuous updates to the warehouse.
  • Developed wrapper scripts to execute Ab Initio graphs and PL/SQL packages followed by error handling routines and post audit checks/data reconciliation after the code execution while using signal file approach to trigger processes.
  • Created generic batch process to direct load into staging tables and then an insert/select into the main base tables to improve performance.
  • Responsible for creating multifiles (MFS), which give the ability to centrally control distributed data files and provide scalability across access patterns; enhanced the performance of application graphs using parallel application techniques, partition components, Ab Initio MFS techniques and PL/SQL.
  • Provided day-to-day primary and secondary production support, worked on various data and batch process related issues.
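
The file-detection and signal-file triggering pattern described above can be sketched in a few lines of Korn-shell-style code. This is a minimal, hypothetical illustration: the landing directory and extensions are invented, and a real wrapper would invoke the Ab Initio graph or PL/SQL loader where the echo appears.

```shell
# Sketch of the signal-file pattern: the load fires only when both the
# data file and its companion .done signal file have arrived.
# The landing directory and file extensions are hypothetical.

LANDING_DIR=${LANDING_DIR:-/tmp/landing_demo}

process_ready_files() {
    for sig in "$LANDING_DIR"/*.done; do
        [ -e "$sig" ] || continue          # no signal files yet
        data=${sig%.done}.dat
        if [ -f "$data" ]; then
            echo "loading $data"           # a real wrapper would call the loader here
            mv "$sig" "$sig.processed"     # consume the signal so each file loads once
        fi
    done
}
```

Run under cron or a polling loop, the function processes each data file exactly once, because the signal file is renamed as soon as its load is dispatched.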

Confidential

Sr Database Consultant/Architect

Responsibilities:

  • Responsible for developing various dimensional and fact load PL/SQL processes.
  • Identified issues related to various fact/dimension processing, table structures, modified the requirements and developed/modified code accordingly.
  • Developed custom scheduling tool using Linux shell scripting and crontab to build complicated batch process orchestration.
  • Developed a generalized framework for batch process orchestration using Shell/AWK/SED programming which can be used by various projects within the organization.
  • Developed Job dependency diagrams, Conceptual, Logical and Physical Data Architecture diagrams - Kimball methodology, Data dictionary, Metadata.
  • Documented and designed the entire batch process orchestration using Microsoft Visio and Microsoft Word, and created the batch process run book to be used by the support team.
  • Developed various shell scripts and shell wrappers in a Linux environment to successfully run various PL/SQL packages.
  • Developed error handling/ logging in both Oracle and UNIX environments to identify any issues in the batch processing, notifying the support team by sending automated emails.
  • Developed views used by Cognos reporting to identify overall batch process time, broken down by individual process, producing a detailed report of the entire batch process that helps the support team understand the time taken by each step.
  • Worked with the data source team helping them in developing the data acquisition processes in conjunction with oracle, developing various functions/procedures in oracle to help in the successful data acquisition and logging.
  • Worked with business users, helped them with any issues related to the data or data processing working aggressively with them in resolving the issues and providing solutions.
  • Developed test cases and test plans to test each individual process fixing the issues and developing the code using modular approach.
  • Set up the Linux environment and designed the Linux structure for the Financial Data Mart project, which helps in maintaining the code and logs in various directories thus helping in analyzing the issues and fixing them quickly.
  • Provided 24/7 primary production support as well as Secondary support
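
A generalized shell-based orchestration framework of the kind described above usually reduces to a small set of logging and step-runner functions. The sketch below is illustrative only; the log path, step names and the notification hook (standing in for the automated e-mails, e.g. via mailx) are assumptions.

```shell
# Sketch of a generic batch-orchestration helper: run steps in order,
# log each with a timestamp, and stop with a notification on the first
# failure. Log path and notify hook are illustrative stand-ins.

LOG=${LOG:-/tmp/batch_demo.log}

log() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOG"
}

notify_support() {
    # placeholder for the automated e-mail notification
    log "NOTIFY support: $*"
}

run_step() {
    name=$1; shift
    log "START $name"
    if "$@"; then
        log "OK $name"
    else
        log "FAIL $name"
        notify_support "$name failed"
        return 1
    fi
}
```

Individual projects would then describe their batch as a sequence of run_step calls, keeping the logging, error handling and notification behavior uniform across the organization.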

Confidential

Senior ETL consultant/Architect

Responsibilities:

  • Worked with SMEs of various systems feeding the data warehouse conducting JAD sessions with the teams and prepared detailed design specifications and technical design specifications documents
  • Developed multiple PL/SQL programs, stored procedures, triggers, modular AbInitio graphsfor data loading and data validations and documented a plan for data management and data integrity checks.
  • Developed routines using SQL*Plus and Pro*C to extract data from Oracle systems to flat files.
  • Using distributed-query technology, developed procedures to directly query tables located in various different source systems.
  • Created tables, views, snapshots and special-feature functions as required, using procedures, functions and triggers to implement complex, database-intensive business rules.
  • Developed a batch process to transfer flat files from centralized locations to specific directories and execute SQL*Loader and PL/SQL routines to load the data into staging tables, followed by a direct insert into production tables to improve batch performance.
  • Created various processes in ETL environment to integrate data from various sources, applying standardized rules, creating temporary tables, external tables for smooth and efficient transition of data from source to target tables.
  • Utilized table functions to support pipelined and parallel execution of transformations implemented in PL/SQL.
  • Tuned SQL queries and Ab Initio graphs using various techniques, such as implementing parallelism in the processes to improve performance; worked with DBAs to resolve various performance-related issues and maintain data integrity.
  • Created marts, star schemas and snowflake schemas for OLAP processing; the Kimball methodology was used for the marts and the Inmon methodology for the warehouse.
  • Identified and separated logical data errors using the Oracle MERGE statement; data-rule violations were handled both in PL/SQL and in the database using an error-logging table.
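
The stage-then-insert load pattern above (SQL*Loader into staging tables, then an insert into the base tables) can be sketched as a small shell batch. Directory names, the userid and the control file below are hypothetical, and DRY_RUN=1 prints each sqlldr command instead of executing it.

```shell
# Sketch of the stage-then-load batch: move flat files from a central
# drop area into a staging directory and build the SQL*Loader call for
# each. All paths and credentials are invented placeholders.

DROP_DIR=${DROP_DIR:-/tmp/drop_demo}
STAGE_DIR=${STAGE_DIR:-/tmp/stage_demo}

stage_and_load() {
    mkdir -p "$STAGE_DIR"
    for f in "$DROP_DIR"/*.dat; do
        [ -e "$f" ] || continue
        mv "$f" "$STAGE_DIR"/
        cmd="sqlldr userid=stg/stg control=stg_load.ctl data=$STAGE_DIR/$(basename "$f") direct=true"
        if [ "${DRY_RUN:-1}" = "1" ]; then
            echo "$cmd"
        else
            eval "$cmd"
            # ...followed by an INSERT /*+ APPEND */ ... SELECT from the
            # staging table into the base table, as described above.
        fi
    done
}
```

Staging first keeps the direct-path load fast and isolates bad records from the production tables; the final insert/select into the base tables then runs as a single set-based operation.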

Confidential

Senior Database Consultant

Responsibilities:

  • Identified the data required to be interfaced into the solution to support the documented design, the file layouts for that data, and the batch processing schedule required to import the data within the required system availability and batch windows.
  • Endeavored to take ownership and support of batch processes, which will support the planning solution.
  • Acted as data futurist (i.e., visionary of future data uses) and catalyst for converting data into knowledge management.
  • Developed PL/SQL, BTEQ, FastLoad, MultiLoad and TPump scripts to implement complicated business logic, followed by error-handling routines and post-audit checks/data reconciliation after execution.
  • Automated monthly, weekly and daily refreshes (data unload/load) using BTEQ scripts as well as SQL/PL-SQL scripts embedded in UNIX shell/Perl scripts.
  • Developed SQL*Loader scripts to load data from flat files into the data warehouse, and utilized Oracle external tables to enable pipelining of the loading phase with the transformation phase.
  • Used database triggers to create history of insertion, updating, deletion and all kinds of audit routines.
  • Ensured compliance with all migration policies and procedures within and around the P4P solution in a Very Large Data warehouse (VLDW) environment.
  • Developed initial unload/load programs to migrate production databases from various source data marts to Oracle data warehouse to supply continuous engineering and updates to the data warehouse.
  • Involved in the preparation of documentation for Data warehouse standards, procedures and naming conventions.
  • Ensured appropriate planning and coordination between releases and business/application solutions.
  • Identified synergies between batch streams, to avoid unnecessary redundancies in data management, sourcing, storage and/or manipulation.
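
The refresh scripts described above embed BTEQ in shell, typically via a here-document so that one wrapper can drive the daily, weekly and monthly runs. The sketch below is a hypothetical illustration: the logon string, table names and extract path are invented, and DRY_RUN=1 prints the generated BTEQ script instead of piping it to the bteq utility.

```shell
# Sketch of a refresh wrapper that generates a BTEQ script from a shell
# here-document. Logon, schema and file path are placeholders.

refresh_table() {
    tbl=$1
    script=$(cat <<EOF
.LOGON tdp1/loaduser,secret;
DELETE FROM stage.$tbl;
.IMPORT DATA FILE=/data/extracts/$tbl.dat
.REPEAT *
INSERT INTO stage.$tbl VALUES (:c1, :c2);
.LOGOFF;
EOF
)
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "$script"
    else
        echo "$script" | bteq
    fi
}
```

Because the table name is interpolated into the here-document, the same wrapper serves every refresh frequency; the scheduler only varies which tables it passes in.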
