
Solution Architect/admin Resume


Herndon, VA

SUMMARY:

  • Over 20 years of experience in Information Technology as an architect and developer in the Business Intelligence, database administration, and Big Data administration arenas, using state-of-the-art technologies.
  • Proven history of building large-scale data processing systems and serving as an expert in data warehousing solutions while working with a variety of database technologies.
  • Experienced in architecting highly scalable, distributed systems using different open-source tools, as well as designing and optimizing large, multi-terabyte data warehouses.
  • Experienced with implementation, configuration, security, and administration of Hortonworks and Cloudera platforms on premises and in the cloud (AWS).
  • Experienced in integrating state-of-the-art Big Data technologies into the overall architecture.
  • Possess exceptional talent for identifying problems and creating technological solutions; characterized as committed to quality and excellence.
  • Strong grounding in Software Development Life Cycle methodology, from requirements through deployment.
  • Strong analytical and presentation skills, with excellent interpersonal communication.

TECHNICAL SKILLS:

Big Data: Hadoop ecosystem (HDFS, MapReduce, Spark, Solr, Hive, Oozie, Zookeeper, HBase, Kafka, Sqoop, Atlas, Ranger, Knox), Hortonworks 2.x, Cloudera 5.x, HUE

Cloud (AWS): EC2, S3, RDS, Redshift, Lambda, IAM, VPC, SNS, EBS, Route 53, CloudWatch.

BI Tools: OBIEE 11g/12c, Oracle BI Apps, BI Publisher, Tableau, Power BI.

Databases: Redshift, Postgres, MySQL, Oracle 11g/12c, SQL Server, Teradata, HBase, MongoDB

Security: LDAP, Kerberos, SSL, certificates, encryption, IAM policies, S3 policies

Tools: JIRA, GitHub, Build Forge, CVS, Toad, SQL Developer, TCP/IP, DNS.

Languages: Java, XML, JCL, Python, UNIX Shell Programming, SQL, PL/SQL, COBOL

PROFESSIONAL EXPERIENCE:

Confidential, Herndon, VA

Solution Architect/Admin

Responsibilities:

  • Responsible for implementing Hadoop clusters and translating functional and technical requirements into detailed architecture and design.
  • Implemented and administered databases such as Redshift and Amazon RDS instances.
  • Installed and configured Kerberos and LDAP.
  • Configured IAM roles and policies.
  • Wrote AWS Lambda functions in Python and migrated code from EC2 to serverless computation.
  • Migrated databases from on-premises to the cloud using AWS DMS.
  • Installed and configured Apache Hadoop clusters for application development, along with the Hadoop tools Ranger, Hive, Pig, Oozie, Zookeeper, HBase, Flume, Sqoop, Spark, Solr, and HUE.
  • Created shell and Python scripts for automation.
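
The EC2-to-Lambda migration described above can be sketched as a minimal Python handler; the event shape and key names here are hypothetical, not taken from the actual project:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler sketch: pulls object keys out of a
    hypothetical S3-style event and returns a summary response."""
    records = event.get("Records", [])
    keys = [r.get("s3", {}).get("object", {}).get("key") for r in records]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(keys), "keys": keys}),
    }
```

In a real migration, the per-instance polling loop that ran on EC2 is replaced by an event trigger (for example an S3 notification) invoking this handler.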

Environment: HDFS, Ranger, AWS, Redshift, Postgres, MySQL, Java, Hive, Oozie, Pig, Shell Scripting, Linux, HUE, Sqoop, Spark, HBase, Solr, Tableau.

Confidential, Reston, VA

Sr. Hadoop Administrator.

ACHIEVEMENTS:

  • Installed and configured Apache Hadoop clusters for application development and Hadoop tools such as Hive, Pig, Oozie, Zookeeper, HBase, Flume, Sqoop, Spark, and Solr.
  • Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and log files.

  • Imported real-time data into Hadoop using Kafka and implemented Oozie jobs for daily imports.
  • Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig.
  • Designed and implemented Hive and Pig UDFs for evaluating, filtering, loading, and storing data.
  • Configured Hadoop security using Kerberos, Knox and Ranger.
  • Configured HDFS ACL privileges to users and groups.
  • Implemented NameNode High Availability and performed Hadoop cluster capacity planning by commissioning and decommissioning nodes.
  • Administered and supported Hortonworks and Cloudera distributions.
  • Troubleshot production-level issues in the cluster and its functionality.
  • Backed up data on a regular basis to a remote cluster using DistCp.
  • Performed operating system installations and Hadoop version updates using automation tools.
  • Implemented the Fair Scheduler on the JobTracker to allocate a fair share of resources to small jobs.
  • Worked with different file formats (ORC, RCFile, Parquet, text) and different compression codecs.
  • Imported and exported structured data between relational databases and HDFS/Hive using Sqoop.
  • Implemented a rack-aware topology on the Hadoop cluster.
  • Diligently teamed with the infrastructure, network, database, application and business intelligence teams to provide high data quality and availability.
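
The DistCp remote-cluster backups and Sqoop imports mentioned above can be sketched as small Python helpers that assemble the command lines; the cluster URIs, table names, and mapper count are illustrative, not the actual production values:

```python
def distcp_cmd(src, dest, mappers=20):
    """Build a DistCp command for backing up an HDFS path to a remote
    cluster (source/destination URIs are hypothetical examples)."""
    return ["hadoop", "distcp", "-m", str(mappers), "-update", src, dest]

def sqoop_import_cmd(jdbc_url, table, target_dir):
    """Build a Sqoop import command for loading a relational table into HDFS."""
    return ["sqoop", "import", "--connect", jdbc_url,
            "--table", table, "--target-dir", target_dir]
```

A nightly cron or Oozie-driven wrapper would hand these lists to `subprocess.run(cmd, check=True)` and alert on non-zero exit status.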

Environment: HDFS, MapReduce, Java, Hive, Oozie, Pig, Shell Scripting, Linux, HUE, Sqoop, Spark, Oracle, HBase, Flume, Solr, Tableau.

Confidential, Washington DC

Sr. OBIEE Consultant.

Responsibilities:

  • Defined transformation rules for source to target mappings and developed ETL code to support the transformation.
  • Created and administered the Physical Layer, Business Model & Mapping Layer, and Presentation Layer using the Oracle Business Intelligence Administration Tool.
  • Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
  • Created dimensional hierarchies, level-based measures, and aggregate navigation in the BMM layer.
  • Managed security privileges for each subject area and dashboards according to user requirements.
  • Developed custom reports and ad-hoc queries using Oracle Answers and assigned them to application-specific dashboards.
  • Developed different kinds of reports (drill-down, aggregation) to meet client requirements.
  • Created proactive agents (iBots) and interactive dashboards to alert the financial team to business changes.
  • Managed Oracle Delivers, Scheduler, and iBots.
  • Handled full loads and refresh loads via staging tables in the ETL layer.
  • Created BI Publisher reports on top of BI Answers and aligned them on various dashboards as per requirements.
  • Interacted with the data migration team lead, data migration staff, and other program technical staff as needed to ensure the proposed data migration strategy and successful implementation of the code.
  • Ensured all software developed for migrating production data was developed, controlled, and tested to the same quality level as production software.

Environment: Informatica 8.6/9, OBIEE 11g, Oracle 11g, DAC, IBM AIX 6.1, Windows NT, MS SQL Server 2005, PL/SQL, XML files, SQL Developer/SQL*Plus, UNIX shell scripting, ClearCase, and ClearQuest.

Confidential, Morristown, NJ

Sr. Oracle DBA

Responsibilities:

  • Implemented and maintained Oracle 10g/11g Data Guard/standby databases for fail-over purposes.
  • Replicated to a disaster recovery site for increased server manageability and availability.
  • Participated in developing an enterprise-wide security standard methodology and in its ongoing deployment.
  • Upgraded GoldenGate from version 9.0 to 10.2 and performed a cross-platform GoldenGate migration from Linux to Solaris.
  • Upgraded a 10.2.0.4 two-node RAC ASM database to 11g R2 two-node RAC ASM.
  • Used TKPROF, EXPLAIN PLAN, and STATSPACK to improve performance.
  • Implemented and maintained database security (created and maintained users and roles, assigned privileges).
  • Managed appropriate use of free space within tablespaces, reclaimed space whenever possible. Reorganized tables and indexes within databases when needed.
  • Expert with OEM Grid Control.
  • Tuned performance for optimized results using tools such as EXPLAIN PLAN, SQL*Trace, TKPROF, STATSPACK, AWR, and ADDM reports.
  • Performed database tuning, application tuning, and performance monitoring; fine-tuned initialization parameters, I/O, memory, and operating system kernel parameters.
  • Managed RAC Linux servers running Oracle in cluster environments; patched Oracle databases with the latest patches from Oracle.
  • Installed and configured OEM Grid Control and intensively used the Diagnostics and Tuning packs along with AWR and ADDM reports.
  • Implemented backup strategies for the databases using RMAN, scripting backups of the application database once every two weeks.
  • Identified and troubleshot everyday issues such as locking, script execution, and data audits.

Environment: SQL*Plus, Shell Scripting, SQL, PL/SQL, GoldenGate, UNIX, Red Hat Linux, HP-UX, OEM, Oracle 10g/11g.

Fannie Mae, Reston, VA

Execution Engineer (ROC)

Responsibilities:

  • Analyzed and reviewed production execution runbooks and PCM tickets thoroughly, along with CAs, ELs, and integrators, and provided required changes for smooth execution of runbooks in production.
  • Performed pre-execution checks to certify environment/AutoSys variables and ensure everything was in place for runbook execution in production.
  • Executed hundreds of production runbooks effectively and on time; promptly communicated and coordinated with leads and SMEs on all execution failures.
  • Extensively used Ab Initio Co>Operating System commands such as m_ls, m_wc, m_dump, m_cp, and m_mkfs.
  • Extensively used the AutoSys scheduling tool on the Sun Solaris platform.

Confidential, Arlington, VA

Sr. Software Engineer.

Responsibilities:

  • Involved in requirements gathering, analysis, and understanding of business requirements.
  • Developed Logical and Physical data models that capture current state/future state data elements and data flows using Erwin.
  • Created Tables, Synonyms, Sequences, Views and Stored procedures.
  • Used database triggers to keep a history of inserts, updates, and deletes, and for all kinds of audit routines.
  • Enforced database Integrity using primary keys and foreign keys.
  • Modified existing functions and procedures based on business requirements.
  • Worked extensively on Exception handling to trouble-shoot PL/SQL code.
  • Wrote UNIX shell scripts to automate the daily process as per the business requirement.
  • Used TOAD for PL/SQL development and database access.
  • Fine-tuned SQL queries for maximum efficiency.
  • Created parameterized reports and customized existing reports for presentations using Cross-Tab reports and Sub-Reports.
