Big Data Architect
We are looking for an experienced Big Data Architect to join Hummingbird Technologies to help scale our multi-terabyte image processing pipeline for global agricultural evolution.
You will join a talented team of software engineers, big data engineers, computer vision experts and machine learning researchers, and are expected to get up to speed rapidly in this fast-paced, multi-disciplinary environment. We are not solving trivial problems, but researching and developing to shape the future of crop and farm management through the creation of predictive analysis products, which will be used across the globe to feed the world and minimise the long-term environmental impact of modern, large-scale agriculture.
Hummingbird was founded in 2016 and is the only remote sensing business in UK agriculture to use artificial intelligence, gathering imagery from drones, planes and satellites and combining it with weather and soil data and expert plant pathology to enable precision agriculture. We use the most advanced machine learning and computer vision techniques to deliver actionable insights on crop health directly to the field. Named Best British Tech Startup 2019, we are driving the next generation of precision agriculture to feed the world in a sustainable way!
Existing backers of the business include the European Space Agency, Sir James Dyson, Horizon Ventures, Downing Ventures and Velcourt, the UK's largest commercial farming operation. Our technology partners include Google UK and Cranfield University.
Please note, we are unable to provide sponsorship for this role.
This will be a hands-on position guiding our technical decisions when it comes to our Data Processing platform.
Responsibilities:
- Creating and developing scalable, fault-tolerant, self-healing big data architecture
- Defining, developing, and extending processing paths for a variety of image sources (API, multi-TIFF, hyperspectral, multispectral, RGB)
- Developing and implementing the company's data architecture vision alongside the Head of Engineering and CTO
- Acting as the driving force behind the scalability, operational quality and KPIs of our terabyte-scale data processing pipeline
- Providing day-to-day technical leadership within a distributed engineering team
- Ensuring communication within the distributed engineering team, travelling 10-15% of the time within Europe
Requirements:
- Hands-on polyglot engineering experience in Java (Spring) and Python
- Proven experience in Big Data architecture design and implementation
- Experience of API / microservices design patterns and web technologies
- Excellent problem-solving and communication skills
- Experience with multiple Big Data / message broker tools e.g. Kafka, Flink, Akka, Hadoop/HDFS, RabbitMQ, ActiveMQ, Spark
- Production experience of AWS or GCP
- Production experience of Kubernetes or Docker Swarm
- Technical skills in Data Science, Data Ingestion, Data Augmentation, Big Data and Cloud platforms
- Database selection (document, graph, column, relational), design, implementation, and data modelling
- Knowledge of the principles governing best practice in platform data architecture, management and governance
- Experience with agile methodologies, including TDD, BDD, Scrum and Kanban
- MSc in Computer Science, Engineering or relevant field
Nice to have:
- Experience in applied machine learning, computer vision and image processing
- Experience in geospatial databases, object manipulation and ETL/transformations
- Experience in electronics/embedded systems for UAVs
- A demonstrable personal interest in environmental impact and sustainability
Benefits:
- The opportunity to work on an innovative product and help shape the future of the agriculture industry
- Competitive package
- Private Healthcare
- Government Pension scheme
- Flexible working
- Cycle to work scheme
- Extra day off for your birthday
- Discretionary bonus
- Learning & development allowance