Data Architect Resume
SUMMARY
- I am a data architect with 18 years of expertise, working for Confidential. My practice plays a crucial role in architecting and building their Big Data platform, which provides enterprise-level dashboard visualization and services to different groups within Confidential.
- I have expertise in Big Data systems built on the Hadoop ecosystem (HDFS, Storm, Spark, Scala, MapReduce, Hive, Sqoop, Oozie, HBase, Pig, Airflow, Druid), building data warehouse platforms for Business Intelligence in contact center and call center environments. This includes collecting data from various sources and performing ETL, data massaging, and view creation on the Hortonworks HDP platform, using Ambari for cluster installation and administration, and developing Big Data applications for various customers.
- My passion for building big data platforms in the cloud has helped me create solutions that use AWS/Azure services (Lambda, S3, EMR, Glue, Databricks, Delta Lake, Koalas) across on-prem, hybrid, and cloud infrastructure.
- I have extensive domain expertise in multiple enterprise platforms, J2EE architecture, client/server technologies, IVR technologies, and large-scale distributed SOA.
- I possess a proven track record of achieving results in a fast-paced, multi-vendor, high-pressure environment with constantly changing requirements and tight deadlines.
TECHNICAL SKILLS
- Hortonworks HDP, Hadoop, Spark, Scala, Ambari, Hive, Pig, NoSQL, MapReduce, Sqoop, Yarn, Airflow, Druid, Databricks.
- IBM WebSphere Voice Response (WVR), IBM WebSphere Unified Messaging (WUM), Nuance Voice Platform (NVP), ISDN-PRI, SIP, Avaya, Genesys
- SOA, Client/Server distributed architecture, OOAD with UML, SAAS on Cloud Computing, MS Visio and IBM RSA tool.
- IBM BPM, WebSphere Message Queues, WebSphere Broker, IBM ESB, WebSphere Broker Toolkit, WebSphere Integration Developer and Process Server, Microservices, Redis.
- GoLang, Java, J2EE, JMS, REST and SOAP web services, Struts, EJB, Maven, Spring Boot, VoiceXML, XML, XSD, OSGi, JavaScript, JPA, JBoss Rule Engine, HttpClient, Swing, BIRT, Ajax, Log4j, Python, SQL/PL-SQL, Jenkins, and Shell Script.
- Oracle RAC, BCMS, IBM Global Mirror, Oracle Data Guard, ODS, DB2, Microsoft SQL Server, MySQL, PostgreSQL.
- IBM RAD, Eclipse, Ant, QuickBuild, CVS, Perforce, Clear Case, Putty, Cygwin, Jira, Confluence, Web 2.0, RIA, jQuery, Scrum.
- JUnit, JUnitPerf, JMeter, JProfiler, Jetty, IBM HeapAnalyzer
- IBM WAS Network Deployment, JBoss, Resin, Tomcat
- AIX, Linux, Windows, IBM Power Series, X-series, VMware, SAN storage
- SoftLayer, Azure, AWS, Google Cloud, Docker, Kubernetes.
PROFESSIONAL EXPERIENCE
Data Architect
Confidential
Responsibilities:
- I designed and architected their Big Data platform, which builds enterprise-level security dashboards using analytics and machine learning.
- The platform is built on a DataOps philosophy, treating the Hadoop cluster as the data lake, Spark pipelines as the recipes, and analytics requests as the orders.
- The Big Data platform receives unstructured data from various sites around the globe. Using Airflow coordinators/workflows, the platform mines the good data with Spark, and the Spark recipe then enriches it with enterprise features for running predictive models.
- The resulting analytical data is processed using Druid (druid.io), and enterprise dashboards are dynamically created to visualize comprehensive analysis of enterprise-level internal security threats, employee-related information breaches, sales predictions, business outlook, and more (a minimal orchestration sketch of this pipeline follows this list).
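The sketch below shows how such a pipeline could be wired together in an Airflow 2.x DAG, assuming the Spark enrichment job and Druid batch ingestion described above; the DAG id, HDFS paths, job script, and Druid endpoint are illustrative placeholders rather than the production names.

```python
# Minimal Airflow DAG sketch: land raw site feeds, run the Spark enrichment
# "recipe", then hand the result to Druid for dashboard-facing rollups.
# All names and paths here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="security_dashboard_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Move the day's unstructured site feeds from a staging area into the data lake.
    ingest = BashOperator(
        task_id="ingest_site_feeds",
        bash_command="hdfs dfs -put /staging/site_feeds/{{ ds }} /datalake/raw/{{ ds }}",
    )

    # Run the Spark job that filters good records and adds enterprise features.
    enrich = BashOperator(
        task_id="spark_enrich",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "enrich_features.py --date {{ ds }}"  # hypothetical job script
        ),
    )

    # Submit a batch ingestion spec to the Druid overlord for the enriched output.
    druid_ingest = BashOperator(
        task_id="druid_batch_ingest",
        bash_command=(
            "curl -X POST -H 'Content-Type: application/json' "
            "-d @druid_ingestion_spec.json "
            "http://druid-overlord:8090/druid/indexer/v1/task"
        ),
    )

    ingest >> enrich >> druid_ingest
```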
Architect
Confidential
Responsibilities:
- I vet architecture documents against business requirements, review APIs for various projects, and map data entities to APIs and user interfaces.
- I transformed the Confidential eBill application onto a Big Data ecosystem, using Hadoop as the data lake, Spark for data enrichment and transformation, Hive for SQL-based data access, and Sqoop to migrate historical data into Hadoop. I used a Spark ML decision tree to build a model that vetted existing predictions and assumptions (see the sketch after this list).
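A minimal PySpark sketch of that decision-tree validation step, assuming the Sqoop-loaded history is exposed through a Hive table; the table name, feature columns, and label are hypothetical placeholders.

```python
# Train a Spark ML decision tree on historical eBill records and measure its
# accuracy on held-out data to vet the existing prediction assumptions.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import DecisionTreeClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = (
    SparkSession.builder.appName("ebill-dt-validation")
    .enableHiveSupport()
    .getOrCreate()
)

# Historical eBill data landed in Hadoop via Sqoop and exposed through Hive.
df = spark.table("ebill.billing_history")  # hypothetical Hive table

assembler = VectorAssembler(
    inputCols=["bill_amount", "days_overdue", "payment_channel_idx"],  # assumed features
    outputCol="features",
)
dt = DecisionTreeClassifier(labelCol="paid_on_time", featuresCol="features", maxDepth=5)

train, test = df.randomSplit([0.8, 0.2], seed=42)
model = Pipeline(stages=[assembler, dt]).fit(train)

# Held-out accuracy, compared against the assumptions baked into the old model.
accuracy = MulticlassClassificationEvaluator(
    labelCol="paid_on_time", predictionCol="prediction", metricName="accuracy"
).evaluate(model.transform(test))
print(f"decision-tree accuracy: {accuracy:.3f}")
```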
Software Architect
Confidential
Responsibilities:
- With Avaya I have been working in the capacity of a Software Architect/Sr. Consultant for their Professional Services domain, Emerging Products and Technology vertical. I lead all their client engagements for pre-sales, new requirements, solutions, delivery, and repeat engagements.
- I focus on custom application creation, packaged solutions, customization of existing products, and infrastructure requirements, building custom BI, DWH, and Big Data applications for contact center and call center customers.
- I used the Hadoop stack to build a customer predictability matrix solution, which uses Sqoop to import data into HDFS from PostgreSQL and MySQL databases; the imported data is then processed by MapReduce programs to build a customer predictability matrix that forecasts the next day's call volume in various segments (see the mapper/reducer sketch after this list).
- Hive and Pig are used to build base data for circle operations.
- For large customers, the solution was built on a multi-NameNode, multi-DataNode configuration to support processing millions of records in Hadoop.
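A minimal Hadoop Streaming sketch (Python) of the per-segment call-volume rollup that feeds the predictability matrix; the comma-separated input layout (segment, date, call_count) and field positions are assumptions for illustration, with the data having arrived via a Sqoop import from PostgreSQL/MySQL.

```python
# mapper/reducer sketch for a per-segment call-volume rollup.
# Map phase emits "segment<TAB>call_count"; reduce phase sums per segment
# (Hadoop Streaming delivers reducer input sorted by key).
import sys


def mapper():
    for line in sys.stdin:
        fields = line.rstrip("\n").split(",")
        if len(fields) < 3:
            continue  # skip malformed rows
        segment, _date, calls = fields[0], fields[1], fields[2]
        print(f"{segment}\t{calls}")


def reducer():
    current_segment, total = None, 0
    for line in sys.stdin:
        segment, calls = line.rstrip("\n").split("\t")
        if segment != current_segment:
            if current_segment is not None:
                print(f"{current_segment}\t{total}")
            current_segment, total = segment, 0
        total += int(calls)
    if current_segment is not None:
        print(f"{current_segment}\t{total}")


if __name__ == "__main__":
    # In a real job the map and reduce phases would be separate scripts passed
    # to the Hadoop Streaming jar; a flag is used here only to keep the sketch
    # in one file.
    reducer() if "--reduce" in sys.argv else mapper()
```

The per-segment totals produced this way would then be aggregated into the forecasting matrix for next-day call volumes, with Hive and Pig building the base datasets for circle operations as noted above.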