A printable hydrogel made of ultrashort peptides could help shape cells into viable tissues.
Some organisms evolve an internal switch that can remain hidden for generations until stress flicks it on.
A single semiconducting material can produce white light by emitting light across the visible spectrum.
A high-frequency model developed using data from new high-precision rain gauges gives fresh insight into the dynamics of rain and runoff events.
Harnessing the power of deep learning leads to better predictions of patient admissions and flow in emergency departments.
A printable ink that is both conductive and transparent can also block radio waves.
A genetic similarity between the viruses that cause SARS and COVID-19 could inform research into potential treatments.
High-resolution analysis of wind speed across Saudi Arabia could help fast-track the expansion of the Kingdom’s emerging world-class wind energy industry.
Novel red LEDs are more temperature stable than those made from the conventional semiconductor of choice.
KAUST Ph.D. graduate Dr. Noha Al-Harthi and doctoral student Rabab Alomairy have won the German Gauss Center for Supercomputing (GCS) Award for original research that best advances high-performance computing, making KAUST the first Middle Eastern institution to receive this prestigious award.
Extreme weather patterns and regions at risk of flooding could be easier to spot using a new statistical model for large spatial datasets.
By training a search agent to make smarter exploratory decisions, relational data can be classified more accurately and efficiently.
Optical fibers wrapped around date palm trunks could help detect the trees’ most destructive pest early enough to save them.
In today’s world, it should come as no surprise that plastic dominates the products we rely on every day. From our technology devices to our water bottles, plastic is almost always an integral structural component.
A layer-based approach raises the efficiency of training artificial intelligence models.
Light can simultaneously transfer energy and data to underwater devices, but there’s a long way to go before these systems can be deployed.
Machine learning tasks using very large data sets can be sped up significantly by estimating the kernel function that best describes the data.
A self-powered water quality sensor could help fish farmers to monitor pollution in their ponds remotely.
A universal high-performance computing interface allows popular statistical tools to run efficiently on large geospatial datasets.
Multifunctional iron nanowires selectively obliterate cancer cells with a triple-punch combination attack.
A method for finding genes that spur tumor growth takes advantage of machine learning algorithms to sift through reams of molecular data collected from studies of cancer cell lines, mouse models and human patients.
A better mathematical understanding of how big waves form could lead to better prediction of tsunami impacts.
A machine learning method has identified highly elusive amino acid sequences involved in cell morphogenesis and adhesion and in diseases like cancer.
Chaos could help put cyberhackers out of business, with a patterned silicon chip that promises to remain uncrackable even in the future.
Better prediction of extreme winds at sparsely observed locations could help optimize the design of wind farms.
Factors influencing the tolerance of barley to saline soils have been uncovered using an advanced robust statistical technique.
What do an electrical engineer, an organic chemist, a materials scientist and a cell biologist all have in common? They invent and improve applications at the interface of biology and electronics.
New methods for training machine learning models are quicker and more accurate than approaches previously considered state of the art.
Asrar Damdam is setting up her own biotech company in Silicon Valley while pursuing a Ph.D. at KAUST.
Bridging the knowledge gap in artificial intelligence requires an embedding function that helps translate between different types of "thinking."
Deep analysis of the way information is shared among parallel computations increases efficiency to accelerate machine learning at scale.