
Solution/Data Architect - Healthcare Resume


SUMMARY:

  • 12+ years of total IT experience.
  • Data Architecture: TOGAF/Zachman Framework enterprise strategy and architecture, business requirements definition, guidelines and standards, process/data flow, conceptual/logical/physical design, performance, partitioning, optimization, scalability/throughput, data quality, exception handling, metadata, master data design and implementation, auditing, capacity planning, use cases, traceability matrix, etc.
  • Business Intelligence: requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, semantic layers.
  • Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-to-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Kimball methodology, star/snowflake design.
  • Application DBA in production, development, test, and data warehousing environments on Oracle, Teradata, SQL Server, and the NoSQL databases Cassandra and MongoDB.
  • Experience leading teams of DBAs, application, ETL, and BI developers, and testers.
  • Well versed in all phases of the SDLC and quality processes.
  • Significant experience in data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouses (star schema, snowflake schema).
  • Worked on MDM (master data management).
  • Responsible for detailed architectural design.
  • Responsible for source-to-target mapping and the traceability matrix.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, and Workflow Manager.
  • Worked on Informatica administration and on performance tuning of ETL processes.
  • Actively involved in data profiling to ensure the quality of vendor data.
  • Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, DB Artisan, ER Studio, Power Designer, and DB Designer.
  • Excellent T-SQL and PL/SQL programming skills (triggers, stored procedures, functions, packages, etc.) in developing applications.
  • Worked on Teradata BTEQ scripts and the Teradata utilities FastLoad, MultiLoad, and TPump.
  • Worked on Teradata performance tuning: proper use of PI, PPI, and secondary indexes, collecting statistics (including COLLECT STATS on volatile tables), using EXPLAIN to read execution plans, avoiding casting of data, etc.
  • Monitoring and optimizing database performance by tuning applications, memory, and disk usage.
  • Worked on GoldenGate replication, both unidirectional and bidirectional.
  • Experience in capacity planning, space management, and storage allocation; performed database administration such as user management and implementing security using roles, privileges, and grants.
  • Experience in Oracle RAC, Data Guard, ASM, installation, cloning, and backup and recovery using RMAN, export/import, and Data Pump.
  • Tuning of SQL statements using tools such as EXPLAIN PLAN, TKPROF, AUTOTRACE, and DBMS_PROFILER; gathering statistics and analyzing tables and indexes; and using pipelined functions, parallel query, inline views, analytic functions, bulk binds, BULK COLLECT, query rewrite, and hints.
  • Experience in table partitioning and in rebuilding and validating indexes in Oracle.
  • Experience in transforming business requirements into technical specifications.
  • Experience in UNIX shell scripting.
  • Worked on SSIS and SSRS.
  • Worked on Tableau, designing reports, dashboards, etc.
  • Experience in BI/DW solutions (ETL, OLAP, data marts) such as OWB, BODI XI (Business Objects ETL), Informatica, Business Objects reporting, and OBIEE.
  • Knowledge of NoSQL databases such as MongoDB.
  • Experience in developing custom UDFs for Pig and Hive.
  • Experience with Hadoop ecosystem components such as MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, ZooKeeper, and Flume, including their installation and configuration.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems.
  • Knowledge of big data, writing Hive and Pig queries against the Hadoop Distributed File System.
  • Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper, and of administrative tasks such as installing Hadoop, commissioning and decommissioning nodes, and managing ecosystem components.
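As a concrete illustration of the source-to-target mapping and traceability-matrix work listed above, the sketch below is a minimal, hypothetical example; every table and column name in it is invented for illustration and does not come from any actual project:

```python
# Minimal sketch of a source-to-target mapping check of the kind used to
# build a traceability matrix. All column names here are hypothetical.
source_columns = {"pat_id", "pat_dob", "pat_zip", "visit_dt"}

# Target column -> source column it is derived from (the mapping half of
# a traceability matrix row).
target_mapping = {
    "patient_key": "pat_id",
    "birth_date": "pat_dob",
    "postal_code": "pat_zip",
}

def unmapped_source_columns(source, mapping):
    """Return source columns that no target column traces back to."""
    return sorted(source - set(mapping.values()))

# visit_dt has no target, so it would surface as a gap in the matrix.
print(unmapped_source_columns(source_columns, target_mapping))  # ['visit_dt']
```

In practice the same check is run over mapping spreadsheets or metadata tables rather than hand-written dictionaries, but the gap-detection logic is the same.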

TECHNICAL SKILLS:

Job Functions: Project Management, Configuration Management, Application Support, DBA Tasks, Database Handling, Analysis, Design, Coding, Testing, Documentation, Maintenance.

Databases: Oracle (RAC) 12c, 11g, 10g, 9i, 8i; MySQL 5.x, 4.x; SQL Server 2005, 2008; Teradata

DBA Tools: OEM (Oracle Enterprise Manager) 12c, Toad, SQL Navigator, Spotlight, DB Artisan, Platinum SQL Station & Analyzer, DB Vision, DB Designer, PL/SQL Editor, Erwin, Power Designer, Oracle Designer, ER Studio, GoldenGate, Teradata SQL Assistant

Data Warehouse Tools: Informatica PowerCenter 8.x, 9.x; Data Integrator XI (Business Objects ETL); Business Objects (reports); Cognos; DataStage; GoldenGate; OBIEE; OWB; Big Data

Tools: D2k (Forms, Reports), Harvest, VSS, Edit Plus 7, Test Director, MS office, MS Project, Clarity, TFS

Languages: PL/SQL, Visual Basic, C, C++, JAVA, D2k (Forms, Reports)

Concept: ORDBMS, RDBMS

Monitoring tools /Others: Nagios, MRTG, Shell Scripting

WORK EXPERIENCE:

Confidential, Portland, Maine

Solution/Data Architect - Healthcare

Environment: Teradata, Oracle, Linux, SQL Server, Big Data, BO Reports, TIBCO, Informatica

Responsibilities:

  • Working as an OLTP/data warehouse architect.
  • Responsible for detailed architectural design.
  • Responsible for source-to-target mapping and the traceability matrix.
  • Gathered requirements from the business and implemented new projects.
  • Data modeling, dimensional modeling, and database design using Power Designer.
  • Working on conceptual, logical, and physical data models.
  • Working on Master Data Management (MDM).
  • Working on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Reverse engineering of databases for performance and business needs.
  • Worked on Teradata BTEQ scripts and the Teradata utilities FastLoad, MultiLoad, and TPump.
  • Worked on Teradata performance tuning: proper use of PI, PPI, and secondary indexes, collecting statistics (including COLLECT STATS on volatile tables), using EXPLAIN to read execution plans, avoiding casting of data, etc.
  • Working on PL/SQL: stored procedures, functions, triggers, etc.
  • Monitoring and optimizing database performance by tuning applications, memory, and disk usage.
  • Tuning of SQL statements in Oracle, Teradata, and MySQL using tools such as EXPLAIN PLAN and AUTOTRACE, gathering statistics, analyzing tables and indexes, and using pipelined functions, parallel query, inline views, PPI, analytic functions, query rewrite, and hints.
  • Worked in SQL Server with Query Analyzer: gathering statistics, analyzing tables and indexes, using the OPTION clause in queries, parallel query, analytic functions, SQL Profiler, table partitioning and rebuilding, and validating indexes.
  • Worked with the BI team on reporting tools such as Business Objects.
  • Working with Hadoop team for Ingestion of data from various sources.
  • Working on Hive tables, loading data and writing Hive queries.
  • Worked on a POC of Kafka for all integration needs.
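Much of the SQL tuning work listed above hinges on comparing execution plans before and after adding an index. The sketch below uses SQLite (Python's standard-library `sqlite3`) purely as a stand-in for the Oracle and Teradata EXPLAIN workflows this resume refers to; the table and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [(i, i % 100, i * 1.5) for i in range(1000)])

# Before indexing, the optimizer can only do a full-table scan.
before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 42").fetchall()

# An index on the filter column (loosely analogous to a Teradata secondary
# index or an Oracle b-tree index) lets the optimizer seek instead of scan.
cur.execute("CREATE INDEX idx_claims_member ON claims (member_id)")
after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 42").fetchall()

# The plan detail string is the last column of each EXPLAIN QUERY PLAN row.
print(before[0][3])  # e.g. "SCAN claims"
print(after[0][3])   # e.g. "SEARCH claims USING INDEX idx_claims_member (member_id=?)"
```

In Oracle the equivalent check would read EXPLAIN PLAN / DBMS_XPLAN output, and in Teradata the EXPLAIN text, as the bullets above note; only the plan syntax differs.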
