Atomfair Brainwave Hub: Semiconductor Material Science and Research Primer / Wide and Ultra-Wide Bandgap Semiconductors / Ultra-Wide Bandgap Oxides
Molybdenum trioxide (MoO₃) has emerged as a critical material in optoelectronic devices owing to its favorable electronic properties and stability. With a bandgap of approximately 3.0 eV, MoO₃ serves as an efficient high-work-function hole injection layer (HIL) in organic light-emitting diodes (OLEDs) and perovskite solar cells (PSCs). Its ability to facilitate charge transport while minimizing energy losses at interfaces makes it indispensable in modern device architectures. This article examines the deposition of MoO₃ by thermal evaporation, its role in interfacial energy alignment, and its degradation mechanisms under UV exposure.

Thermal evaporation is the most widely used method for depositing MoO₃ thin films in optoelectronic applications. The process involves heating MoO₃ powder in a high-vacuum chamber, typically at pressures below 10⁻⁶ Torr, so that the material sublimes onto the substrate. The source temperature typically lies between 700°C and 900°C and, together with the source-to-substrate geometry, sets the deposition rate. Substrate temperature plays a crucial role in film morphology: room-temperature deposition often yields amorphous films, while elevated substrate temperatures (100–200°C) can enhance crystallinity. Film thickness is controlled by monitoring the deposition rate, usually maintained at 0.1–0.5 Å/s, to achieve optimal optical and electrical properties. Thin films (5–20 nm) are preferred for HIL applications to ensure minimal optical absorption while maintaining efficient charge injection.
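As a rough illustration of how these parameters interact, the short Python sketch below estimates how long a deposition run takes to reach a target HIL thickness at a fixed rate. The numbers are drawn from the ranges above; in practice the rate read from a quartz-crystal monitor depends on source temperature, tooling factor, and chamber geometry, so this is an order-of-magnitude estimate rather than a process recipe.

```python
# Rough estimate of MoO3 deposition time from target thickness and rate.
# Example values are taken from the ranges quoted above; real rates depend
# on source temperature, tooling factor, and chamber geometry.

def deposition_time_s(thickness_nm: float, rate_angstrom_per_s: float) -> float:
    """Time (s) to reach a target thickness at a constant deposition rate."""
    thickness_angstrom = thickness_nm * 10.0  # 1 nm = 10 Å
    return thickness_angstrom / rate_angstrom_per_s

for rate in (0.1, 0.3, 0.5):              # Å/s, typical HIL deposition rates
    t = deposition_time_s(10.0, rate)     # 10 nm target HIL thickness
    print(f"rate {rate:.1f} Å/s -> {t:.0f} s ({t/60:.1f} min) for 10 nm")
```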

The effectiveness of MoO₃ as a hole injection layer stems from its electronic structure. The material exhibits a work function of approximately 5.3–5.7 eV, which aligns well with the highest occupied molecular orbital (HOMO) levels of common organic semiconductors (e.g., 5.1–5.5 eV for NPB or PEDOT:PSS) and the valence band of perovskite materials (e.g., 5.4–5.8 eV for MAPbI₃). This alignment reduces the energy barrier for hole injection, enhancing device efficiency. Ultraviolet photoelectron spectroscopy (UPS) studies confirm that MoO₃ introduces interfacial dipole layers, further lowering the effective injection barrier by 0.2–0.4 eV. In OLEDs, this results in lower turn-on voltages and higher luminance efficiency. For perovskite solar cells, MoO₃-based hole transport layers (HTLs) improve open-circuit voltage (V_OC) and fill factor (FF) by reducing recombination losses at the anode interface.
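The alignment argument can be made concrete with a first-order estimate: the hole injection barrier is roughly the difference between the HOMO (or valence band) energy and the effective work function of the MoO₃-coated contact, where the interfacial dipole adds to that work function. The sketch below uses mid-range values from the figures quoted above and deliberately ignores band bending, gap states, and Fermi-level pinning, so it should be read as an illustration of the bookkeeping rather than a device model.

```python
# First-order estimate of the hole injection barrier at a MoO3 interface.
# Barrier ≈ E_HOMO (or valence band) - effective contact work function,
# where the interfacial dipole raises the effective work function.
# Values are mid-range numbers from the text above (all in eV below vacuum).

WF_MOO3 = 5.5          # eV, mid-range MoO3 work function
DIPOLE_SHIFT = 0.3     # eV, mid-range interfacial dipole contribution

hole_levels = {
    "NPB (HOMO)": 5.4,
    "PEDOT:PSS": 5.1,
    "MAPbI3 (valence band)": 5.6,
}

for name, level in hole_levels.items():
    barrier = level - (WF_MOO3 + DIPOLE_SHIFT)
    status = "near-ohmic" if barrier <= 0.1 else "barrier-limited"
    print(f"{name}: injection barrier ≈ {barrier:+.1f} eV ({status})")
```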

Despite its advantages, MoO₃ is susceptible to degradation under prolonged UV exposure, which limits device lifetime. The primary degradation mechanism involves photo-induced oxygen vacancy formation. Under UV illumination (wavelengths < 400 nm), MoO₃ undergoes a reduction from Mo⁶⁺ to Mo⁵⁺ or Mo⁴⁺ states, creating sub-stoichiometric MoO₃₋ₓ. This process increases free carrier concentration, leading to a gradual decrease in work function and a rise in optical absorption. X-ray photoelectron spectroscopy (XPS) studies reveal that UV exposure for 100 hours can reduce the Mo⁶⁺ content by 15–20%, accompanied by a work function reduction of 0.3–0.5 eV. In OLEDs, this degradation manifests as increased driving voltage and decreased luminance over time. In perovskite solar cells, it contributes to faster efficiency decay under operational conditions.
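The sub-400 nm threshold follows directly from the bandgap: only photons with energy above roughly 3.0 eV are absorbed strongly enough to drive the photo-reduction, and the conversion E (eV) ≈ 1240 / λ (nm) places that cutoff near 410 nm. A minimal check of the relation:

```python
# Photon energy vs. the ~3.0 eV MoO3 bandgap, using E (eV) ≈ 1240 / λ (nm).
# Only above-bandgap (roughly sub-410 nm) light drives the photo-reduction
# described above; longer-wavelength visible light is largely benign here.

BANDGAP_EV = 3.0

for wavelength_nm in (365, 400, 450):
    energy_ev = 1239.84 / wavelength_nm   # hc ≈ 1239.84 eV·nm
    relation = "above" if energy_ev > BANDGAP_EV else "below"
    print(f"{wavelength_nm} nm -> {energy_ev:.2f} eV, {relation} bandgap")
```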

Encapsulation strategies and interfacial engineering are employed to mitigate UV-induced degradation. Incorporating a thin buffer layer, such as NiOₓ or WO₃, between MoO₃ and the active layer can reduce direct UV exposure while maintaining efficient hole extraction. Additionally, optimizing the MoO₃ thickness to 10–15 nm balances charge injection efficiency with reduced susceptibility to photo-reduction. Accelerated aging tests under UV illumination (1 sun equivalent intensity) show that encapsulated devices with MoO₃ HIL retain over 80% of their initial efficiency after 500 hours, compared to unencapsulated devices that degrade to 50% efficiency within 200 hours.
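To compare the two aging results on a common footing, one can assume a simple exponential decay of efficiency, η(t) = η₀·exp(−kt), and back out an effective decay constant and T80 lifetime from each retention point. The exponential form is an illustrative assumption; only the retention/time pairs come from the measurements quoted above.

```python
# Compare the two accelerated-aging results assuming simple exponential
# decay of efficiency, eta(t) = eta0 * exp(-k * t). The exponential form is
# an illustrative assumption; only the (retention, time) points are taken
# from the text above.
import math

def decay_constant(retention: float, hours: float) -> float:
    """Effective decay constant k (1/h) from a single retention point."""
    return -math.log(retention) / hours

def t80(k: float) -> float:
    """Time (h) for efficiency to fall to 80% of its initial value."""
    return -math.log(0.80) / k

k_encap = decay_constant(0.80, 500.0)   # encapsulated: 80% retained after 500 h
k_bare = decay_constant(0.50, 200.0)    # unencapsulated: 50% retained after 200 h

print(f"encapsulated:   k = {k_encap:.2e} 1/h, T80 ≈ {t80(k_encap):.0f} h")
print(f"unencapsulated: k = {k_bare:.2e} 1/h, T80 ≈ {t80(k_bare):.0f} h")
```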

The thermal evaporation process also influences MoO₃ stability. Films deposited at higher substrate temperatures (150–200°C) exhibit improved stoichiometry and reduced defect density, making them more resistant to UV-induced degradation. Post-deposition annealing in oxygen ambient at 200–250°C for 30 minutes further passivates oxygen vacancies, enhancing long-term stability. Electrical measurements confirm that annealed MoO₃ films maintain consistent conductivity and work function even after extended UV exposure.

In perovskite solar cells, MoO₃ serves dual roles as a hole extraction layer and a protective barrier against moisture ingress. Its hydrophobic nature reduces interfacial water diffusion, slowing down perovskite decomposition. However, UV-induced defects can compromise this protective function over time. Combining MoO₃ with hydrophobic polymers like PMMA or polyvinylidene fluoride (PVDF) improves moisture resistance without sacrificing charge transport properties. Stability studies under damp heat conditions (85°C, 85% relative humidity) demonstrate that hybrid MoO₃-polymer HTLs extend device lifetime by 30–50% compared to pure MoO₃ layers.

In conclusion, MoO₃ remains a vital component in high-performance OLEDs and perovskite solar cells due to its optimal energy alignment and charge transport properties. Thermal evaporation enables precise control over film characteristics, while interfacial engineering mitigates UV-induced degradation. Continued research into encapsulation techniques and alternative deposition methods, such as atomic layer deposition (ALD), may further enhance the stability and performance of MoO₃-based optoelectronic devices. Understanding and addressing degradation mechanisms will be crucial for advancing the commercial viability of these technologies.