FPGA Network Engineer | London | Hybrid | Up to £60k | Lucrative Stock Options
Are you an FPGA Engineer with high-speed computer networking experience?
Do you have a background in Optical Computer Networks & Systems?
Are you looking to work on cutting-edge products in AI and Machine Learning?
Then this might just be the role for you!
We are working with a disruptive London-based Computer Networking company working at the cutting edge of AI and Machine Learning technology, with applications for increased productivity in Data Centres and HPC.
They are looking for FPGA Engineers with strong high-speed networking experience to join them in their mission to revolutionise AI systems whilst reducing energy consumption and striving for a sustainable future.
This role is suitable for candidates with a Bachelor's, Master's or PhD in a related field and a strong desire to contribute in a commercial setting.
Key responsibilities:
Work within a multidisciplinary R&D team to generate product ideas and concepts, and resolve any issues.
Presenting products internally to management and other stakeholders.
In-house and external prototyping.
Delivering FPGA based systems for internal and client use.
Delivering risk analyses, test plans, protocols, report writing and documentation for FPGA systems.
Preferred experience:
Experience in FPGA design for Optical Networks and Systems.
Experience with high-speed interfacing and Memory Access such as PCIe, CXL, RDMA, DDR4, Ethernet & GTM.
Understanding of clock domain crossing (CDC) techniques.
Understanding of FPGA tool flows (synthesis, partitioning, place & route, timing analysis).
RTL skills with SystemVerilog, Verilog & VHDL.
Experience with Xilinx Versal Premium & Intel Agilex 7.
Scripting experience with TCL and/or Python.
Experience with Questa, ModelSim, GHDL, Verilator, Quartus, Vivado & Vitis.
What's in it for you?
Up to £60k DOE.
Lucrative stock options.
25 days holiday + bank holidays + Xmas/New Year's shutdown.
Hybrid working.
Relocation assistance.
Visa sponsorship provided.