Research Assistant Professor in the Department of Civil Engineering (Ref.: 530755), to commence as soon as possible for two years, with the possibility of renewal.
Duties and Responsibilities (each RAP will focus on one of the following duties)
- Human-Robot Collaboration-based Digital Twin in Production Processes. (1) Perception: alignment, representation, and fusion of multi-source heterogeneous data; recognition and prediction of human actions and intentions. (2) Decision-making: AI agent-based task planning and allocation. (3) Control: end-to-end control based on reinforcement learning.
- Intelligent Production Scheduling. (1) Distributed Factory Production for Modular Construction: intelligent work package planning and scheduling optimization for modular construction (LP, hyper-heuristics, dynamic programming, reinforcement learning). (2) Modular Construction Project Scheduling: stochastic planning and distributionally robust optimization. (3) Progress Auto-Generation Based on Large Language Models: NLP4Op, natural language processing, and generative models.
- Production Knowledge Engineering. (1) Floor Plan Generation and Intelligent Configuration Optimization: based on generative models and graph models. (2) Semantic and Structural Characteristics of Ontologies and Knowledge Graphs: build a system for understanding and automatically constructing ontologies and knowledge graphs across projects and tasks in the construction domain. (3) Language Models for the Engineering Field: research on language models suited to engineering domains, accounting for how experience accumulates in construction practice and for the uniqueness, temporality, and multimodal data complexity of engineering projects, including but not limited to synthetic data generation, retrieval-augmented generation, and continual learning techniques.
Enquiries about the duties of the post should be sent to Dr. Xiao LI at shell.x.li@hku.hk.
Requirements
Candidates should possess a PhD in Computer Science, Industrial Engineering, Automation, Civil Engineering, Electronic Information, Artificial Intelligence, or a related field. They must be able to commit to full-time employment without any concurrent employment obligations.
The following skills and knowledge are expected:
- For the digital twin direction: a solid technical foundation relevant to the research topics above, such as image processing, deep learning, reinforcement learning, large models, and control theory; a strong grounding in computer science and mathematics; familiarity with simulation software such as Nvidia Omniverse or Gazebo is desirable but not required.
- For the scheduling direction: proficiency with mainstream solvers (e.g., Gurobi, CPLEX); expertise in at least one programming language (e.g., Python); familiarity with machine learning or deep learning frameworks (e.g., PyTorch); knowledge of reinforcement learning fundamentals and algorithms (e.g., SAC, PPO, DDPG). Research experience in production scheduling, project scheduling, or code generation is a plus.
- For the knowledge engineering direction: proficiency in at least one programming language (e.g., Python); familiarity with machine learning or deep learning frameworks (e.g., PyTorch). Experience in data analysis and multimodal data processing, as well as research experience in knowledge graphs, natural language processing, or generative models, is a plus.
What We Offer
The appointment will commence as soon as possible on a two-year temporary basis, with the possibility of renewal subject to satisfactory performance and funding availability. A highly competitive salary commensurate with qualifications and experience will be offered. Other benefits include annual leave, medical benefits, and free access to on-campus gyms and libraries.
How to Apply
The University only accepts online applications for the above post. Applicants should apply online and upload an up-to-date C.V. Review of applications will start as soon as possible and continue until December 31, 2025, or until the post is filled, whichever is earlier.
The University is an equal opportunities employer and is committed to equality, ethics, inclusivity, diversity, and transparency.