- 13+ Years of IT experience encompassing Development, Enterprise Architecture Design, Hardware Sizing, Capacity Planning, Siebel CRM Implementation, Performance Engineering, Business Process Modeling, Systems Design, Implementation, and Customer Support
- 8+ Years of experience in Big Data Platform Administration on Cloudera, Hortonworks & Apache Distributions
- Experience in designing and implementing Complex Systems and providing Solutions for Big Data (Data Lake) & Data Lakehouse Implementations On-Premises & On-Cloud (Azure, AWS)
- Experience in setting up and supporting Hadoop Clusters On-Premises and on Cloud (AWS, GCP, and Azure), including Design, Security, Capacity Planning, Cluster Setup and Performance Tuning.
- Experience in implementing an Open Data Lakehouse Solution to address limitations of the Traditional Data Warehouse + Data Lake, using Open Source Technologies Apache Iceberg, Delta Lake, Apache Hudi, and Dremio on Kubernetes (K8s), Ceph S3 & HDP
- Experience in Installing, Configuring, Upgrading and Administering Hadoop and its Ecosystem components on Cloudera (CDH), Hortonworks (HDP), Cloudera Data Platform (CDP) and Apache Hadoop Distributions.
- Experience in Upgrading/Migrating CDH v6 Clusters to Cloudera Data Platform (CDP) 7.1.7
- Successfully implemented an Open Source Real-Time Analytics Solution (OReTAS) using only the latest Open Source Technologies such as Apache Hadoop, Spark, Kafka (KRaft), NiFi, Superset, Druid, and PostgreSQL/MySQL.
- In-depth understanding of Big Data Hadoop Ecosystem components: HDFS, YARN (MRv2), Hive, Impala, Sqoop, Flume, Spark, Tez, HUE, HBase, Kafka, NiFi, Superset, Druid, Airflow, Solr, Oozie, Sentry, Kudu, Ozone, Atlas, Knox, Ranger, Ranger KMS, etc.
- Experience in Apache NiFi Development: developed Flows to migrate RDBMS Database Schemas/Tables to MySQL, PostgreSQL, HDFS, Hive, Kafka, Druid, etc.; implemented Change Data Capture (CDC) for RDBMS; and migrated Flume workloads to NiFi Flows for Twitter Data Ingestion
- Experience in Docker Containerization and Orchestration using Kubernetes: Application Deployments using Helm, managing Kubernetes Storage (PV, PVC, StorageClass Provisioners), ConfigMaps/Secrets, Networking (Services, Endpoints, DNS, Load Balancers, Ingress), ReplicaSets, DaemonSets, StatefulSets, Service Mesh (Istio, Linkerd), API Gateways, Network Proxies (Envoy), and GitOps Solutions using ArgoCD.
- Experience in DevOps Tools - Git, GitHub/GitLab, Bitbucket, Ansible, Jenkins & HashiCorp Suite - Terraform, Consul, Vault, Boundary, Packer, Nomad, Waypoint.
- 🌍 I'm based in Hyderabad (Telangana), India.
- 🧠 I'm learning DevOps, DevSecOps, GitOps, MLOps, AIOps
- 🤝 I'm open to collaborating on Big Data (Data Lake), DevOps and Data Lakehouse Design & Solutions
name: Goutham Reddy Bojja
role: Systems Engineer 3
skills:
- programming:
- Python
- Bash
- devops:
- Docker
- Kubernetes
- Jenkins
- Ansible
- GitHub Actions
- cloud:
- AWS (EC2, S3, IAM, EKS)
- Terraform
- CI/CD Pipelines
- bigdata:
- Cloudera (CDH, CDP)
- Hortonworks (HDP)
- Apache Hadoop
- Dremio
- tools:
- Linux
- Git
- NGINX
- Helm