Selectivity is a key success factor in the chiral catalyst technologies market. Understanding the fundamental processes that occur when a reagent interacts with a homogeneous single site catalyst, both in its approach and at the active site, is therefore critical to the rational design of new catalysts. Ruthenium-based asymmetric hydrogenation catalysts have been considered as part of a collaborative research project. [(S)-XylBINAP-RuH2-(S,S)-DPEN], first developed by Noyori (1–3), is studied as the parent or prototype model of a series of efficient hydrogenation catalysts, among them the catalysts based on the P-Phos, PhanePhos and ParaPhos ligand families (4).
Introduction The Oxford Battery Modelling Symposium was held in Oxford, UK, from 18th to 19th March 2019. The conference was specifically designed to gather mathematicians, chemists and engineers within the battery modelling community. It was very well received and brought together 170 participants with worldwide representation from academia, research organisations and industry involved in...
Long-distance air travel requires fuel with a high specific energy and a high energy density. There are no viable alternatives to carbon-based fuels. Synthetic jet fuel from the Fischer-Tropsch (FT) process, employing sustainable feedstocks, is a potential low-carbon alternative. A number of synthetic fuel production routes have been developed, using a range of feedstocks including biomass, waste, hydrogen and captured carbon dioxide. We review three energy system models and find that many of these production routes are not represented. We examine the market share of synthetic fuels in each model in a scenario in which the Paris Agreement target is achieved. In the TIAM-UCL and UK TIMES models, it is cheaper in 2050 to use conventional jet fuel coupled with a negative emissions technology than to produce sustainable synthetic fuels. However, the JRC-EU-TIMES model, which represents the most production routes, finds a substantial role for synthetic jet fuels, partly because underground CO2 storage is assumed to be limited. These scenarios demonstrate a strong link between synthetic fuels, carbon capture and storage (CCS) and negative emissions. Future model improvements include better representation of the blending limits that synthetic jet fuels must meet under international fuel standards, lower synthetic fuel costs and assurance that production routes are sustainable.
The catalytic steam reforming of natural gas consumes up to approximately 60% of the overall energy used in ammonia production, so optimising reforming catalyst performance can significantly improve the operation of the whole ammonia plant. An online model uses actual process parameters to optimise and reconcile primary reforming product data, with the possibility of predicting catalyst performance. The model combines a commercial simulator with open-source code, based on scripts and functions in the form of m-files, to calculate the physical properties of the reacting gases. Optimisation of the steady-state flowsheet, based on real-time plant data from the distributed control system (DCS), is essential for applying the model at the industrial level, and the simplicity of the calculation method provides the fundamental basis for industrial application within a digitalisation initiative. The principal aim of the optimisation procedure is to evaluate changes in the methane working curve relative to its equilibrium curve, as well as the methane outlet molar concentration, which is the critical process parameter in reforming catalyst operation. An industrial top-fired primary reformer unit based on Kellogg Inc technology served to validate the model. The calculation procedure is used for continuous online evaluation of most commercially available primary reformer catalysts and, based on this evaluation, the model can recommend actions to mitigate marginal performance and prolong reformer catalyst lifetime.
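As a concrete illustration of the kind of calculation involved, the sketch below estimates a methane approach-to-equilibrium temperature for the steam reforming reaction from outlet partial pressures. It is not the plant model described above: the van 't Hoff constants, the solver bracket and the example operating values are all assumptions chosen purely for demonstration.

```python
# Illustrative sketch (not the plant model described in the abstract): estimate
# the methane approach-to-equilibrium temperature for steam reforming,
#   CH4 + H2O <=> CO + 3 H2,
# from measured outlet partial pressures. The thermodynamic constants are rough
# standard-state values assumed for illustration only.
import numpy as np
from scipy.optimize import brentq

R = 8.314          # J mol-1 K-1
DH = 206.1e3       # J mol-1, approx. standard reaction enthalpy of reforming
DS = 214.8         # J mol-1 K-1, approx. standard reaction entropy of reforming

def K_eq(T):
    """Equilibrium constant (bar^2) from a simple van 't Hoff approximation."""
    return np.exp(-(DH - T * DS) / (R * T))

def approach_to_equilibrium(p_ch4, p_h2o, p_co, p_h2, T_out):
    """Return T_eq - T_out in K; small values indicate an active catalyst."""
    K_obs = (p_co * p_h2**3) / (p_ch4 * p_h2o)       # partial pressures in bar
    T_eq = brentq(lambda T: K_eq(T) - K_obs, 400.0, 1500.0)
    return T_eq - T_out

# Hypothetical outlet conditions (bar, K), purely for demonstration (~22 K)
print(approach_to_equilibrium(p_ch4=4.0, p_h2o=14.0, p_co=2.0, p_h2=12.0,
                              T_out=1120.0))
```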
1. Introduction There are few mathematical breakthroughs that have had as dramatic an impact on the scientific process as the Fourier transform. Defined in 1807 in a paper by Jean Baptiste Joseph Fourier (1) to solve a problem in heat conduction, the integral transform, Equation (i):

$$F(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, \mathrm{d}t \qquad \mathrm{(i)}$$

and its inverse, Equation (ii):

$$f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} F(\omega)\, e^{i\omega t}\, \mathrm{d}\omega \qquad \mathrm{(ii)}$$

provide the framework to determine the spectral make-up of a time...
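As a minimal illustration of how the transform reveals the spectral make-up of a signal in practice, the sketch below applies the discrete Fourier transform (via numpy's FFT) to a synthetic two-tone signal; the sampling rate and tone frequencies are arbitrary choices for demonstration only.

```python
# Minimal sketch: determine the spectral make-up of a time-domain signal with
# the discrete Fourier transform (numpy FFT). Signal parameters are illustrative.
import numpy as np

fs = 1000.0                                # sampling frequency, Hz
t = np.arange(0, 1.0, 1.0 / fs)            # 1 s of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)             # one-sided spectrum
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

# The two dominant peaks recover the 50 Hz and 120 Hz components
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))                       # ~[50.0, 120.0]
```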
Introduction In recent years, whenever the subject of digitalisation or digital transformation is brought up for discussion, we normally observe two distinct reactions from the attendees: one group is excited and satisfied; the other, interested and worried. Of course, some have a good mixture of both. The former has been from companies, big or small, which have a clear...
The value of statistical tools in the scientific world has long been recognised, although the application of statistics to disciplines such as chemistry creates multiple challenges that are identified and addressed in this article. The benefits, explained here with real examples, far outweigh any short-term barriers in the initial application, overall saving resources and delivering better products and solutions for customers and the world. The accessibility of data in current times, combined with user-friendly statistical packages such as JMP®, makes statistics available to everyone. The aim of this article is to motivate and enable both scientists and engineers (referred to subsequently in this article as scientists) to apply these techniques within their projects.
It is human nature to prefer additive problem solving even if removal may be the more efficient solution. This heuristic has wide-ranging implications for science, innovation and complex problem solving, and it is compounded when dealing with these issues at an institutional level. Additive solutions to workflows, in the form of extra software tools and proprietary digital solutions, can impede work without offering any advantages in terms of Findable, Accessible, Interoperable, Reusable (FAIR) data principles or productivity. This viewpoint highlights one possible workflow, and the mentality underpinning it, with the aim of incorporating FAIR data principles, improving productivity and extending the longevity of written documents, while improving workloads within industrial research and development (R&D).
The design of catalyst products to reduce harmful emissions is currently an intensive process of expert-driven discovery, taking several years to develop a product. Machine learning can accelerate this timescale, leveraging historic experimental data from related products to guide which new formulations and experiments will enable a project to most directly reach its targets. We used machine learning to accurately model 16 key performance targets for catalyst products, enabling detailed understanding of the factors governing catalyst performance and realistic suggestions of future experiments to rapidly develop more effective products. The proposed formulations are currently undergoing experimental validation.
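The abstract does not specify which algorithms were used, so the sketch below is only a generic illustration of the workflow it describes: a multi-output regressor mapping formulation variables to performance targets, whose feature importances hint at the factors governing performance. The data, feature count and model choice are all hypothetical.

```python
# Illustrative sketch only: the modelling approach, features and targets here
# are hypothetical stand-ins, not the method described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 6))   # hypothetical formulation variables (e.g. loadings)
Y = rng.random((200, 3))   # hypothetical performance targets (e.g. conversions)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print(cross_val_score(model, X, Y[:, 0], cv=5).mean())   # validate one target

model.fit(X, Y)                    # RandomForestRegressor supports multi-output
print(model.feature_importances_)  # which formulation variables matter most
```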
There is an Erratum for this article: https://www.technology.matthey.com/article/66/3/245-245/ In the manufacture of pelleted catalyst products, controlling the physical properties of the pellets and limiting their variability are of critical importance. To achieve tight control over these critical quality attributes (CQAs), it is necessary to understand their relationship with the properties...
This paper summarises the results of collaborative research on investment casting of widely used platinum alloys (platinum with 5 wt% ruthenium (Pt-5Ru) and platinum with 5 wt% cobalt (Pt-5Co)) for jewellery purposes. To enable simulation of the casting process, a materials database was developed as a first step. Casting simulation tools based on computational fluid dynamics (CFD) were used to optimise the casting process parameters and develop an improved understanding of their role. Selected casting trials were conducted using industrial tilt and centrifugal casting machines and the casting process was monitored in detail. Dedicated tree setups for the different machines were optimised using the casting simulation tools. The form-filling, surface quality, microstructure and porosity of the cast items were analysed to investigate the role of different casting parameters and geometrical conditions in the different casting setups. The casting simulation results led to a deeper understanding of the experimental casting results.
Introduction Over the last decade, the term ‘digital transformation’ has become prevalent across a wide variety of organisations. It refers to converting existing manual processes to create a more efficient and agile business environment. In 2018, >70% of organisations were reported as having a digital strategy or working to implement one (1). Johnson Matthey has established both key...
Two widely used jewellery investment casting alloys (platinum with 5 wt% ruthenium (Pt-5Ru) and platinum with 5 wt% cobalt (Pt-5Co)) suffer from poor castability and other drawbacks. In this work, thermodynamic calculations of alloy properties were employed to optimise the alloy compositions. Segregation behaviour appeared to be important for the melting range, and Scheil simulations were used to model segregation under typical casting conditions. Based on these simulation results, small additions of Co were found to significantly improve the castability of PtRu. Casting trials proved that ternary Pt-Co-Ru alloys show superior casting properties, in particular better form-filling and surface quality as well as reduced grain size and porosity, compared to the binary alloys. To replace Co, further work on other ternary systems, including experimental study of their melting ranges, appears necessary.
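For orientation, the classical Scheil-Gulliver relation underlying such segregation simulations can be written as below; this simple form, added here only for illustration, assumes no diffusion in the solid and complete mixing in the liquid, whereas the CALPHAD-based Scheil calculations used in the work generalise it with real thermodynamic data:

$$C_\mathrm{s} = k\,C_0\,(1 - f_\mathrm{s})^{\,k-1}$$

where $C_\mathrm{s}$ is the solute content of the solid forming at solid fraction $f_\mathrm{s}$, $C_0$ is the nominal alloy composition and $k$ is the partition coefficient; values of $k$ far from unity imply strong segregation and a wide effective melting range.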
The theme for this issue of the journal is modelling and its usefulness to Johnson Matthey in a wide range of research and development (R&D) areas. Modelling is one of three core competencies within Johnson Matthey, together with the ability to control materials at the atomic scale, and to characterise materials using state of the art techniques. It forms a crucial component of the...
Models, which underpin all chemical engineering design work, vary widely in their complexity, ranging from traditional dimensionless number correlations through to modern computer-based techniques such as computational fluid dynamics (CFD) and the discrete element method (DEM). Industrial users require confidence in a model under the conditions in which it is to be applied before using it for design purposes, and this can be a reason for the slow acceptance of new techniques. This paper explores the validity of models and their validation using a variety of examples from heat transfer, reaction kinetics, and particle and fluid flow, considering both traditional and modern computational approaches.
The examples highlight that, when comparing models to experimental data, the mathematical form of the equations can contribute to an apparently good ‘fit’ while the actual adjustable parameter values are poorly predicted; residuals or least squares alone are not a reliable indicator of the quality of a model fit or of model discrimination. When fitting models to experimental data, confidence in the adjustable parameter values is essential. A finite set of experimental data can fit many different models, often with many sets of parameter values, and not all of these models are useful for design. For that purpose a model needs to be founded upon the real physics of the system, with adjustable parameters that represent real quantities which can be measured, computed or estimated independently. The examples also show the importance of validating a model against more than one output parameter; instances are shown where an overly simplistic validation exercise can be misleading.
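To make this point concrete, the short sketch below (synthetic data and hypothetical values, not taken from the paper) fits an Arrhenius expression over a narrow temperature window and then shifts the activation energy while re-balancing the pre-exponential term: the sum of squared residuals barely changes even though the parameter values differ appreciably, illustrating why residuals alone cannot validate fitted parameters.

```python
# Sketch with synthetic, hypothetical data: over a narrow temperature window an
# Arrhenius model fits almost equally well along a ridge of correlated
# (pre-exponential, activation energy) pairs, so residuals alone say little
# about the fitted parameter values.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314
T = np.linspace(550.0, 600.0, 10)                    # K, narrow range
rng = np.random.default_rng(1)
k_obs = 1.0e8 * np.exp(-100e3 / (R * T)) * (1 + 0.02 * rng.standard_normal(T.size))

def model(T, lnA, Ea):
    return np.exp(lnA - Ea / (R * T))

popt, pcov = curve_fit(model, T, k_obs, p0=[18.0, 95e3])
sse_best = np.sum((model(T, *popt) - k_obs) ** 2)

# Shift Ea by 2 kJ/mol and re-adjust lnA so the rate still matches at the mean T
Tm = T.mean()
lnA2, Ea2 = popt[0] + 2e3 / (R * Tm), popt[1] + 2e3
sse_alt = np.sum((model(T, lnA2, Ea2) - k_obs) ** 2)

print(f"best fit: lnA={popt[0]:.2f}, Ea={popt[1]/1e3:.1f} kJ/mol, SSE={sse_best:.3e}")
print(f"shifted:  lnA={lnA2:.2f}, Ea={Ea2/1e3:.1f} kJ/mol, SSE={sse_alt:.3e}")
```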
This paper therefore shows, across a range of modelling approaches and applications, that extreme care is required when validating a model. Models require validation under the conditions in which they are to be applied and against more than one output parameter, using appropriate data across appropriate scales. The paper encourages this practice of model validation in order to better persuade industry to adopt more advanced modelling approaches in the future.
Computer simulation has become an important tool for designing automotive emission control systems. This paper highlights some of the key developments made in modelling of diesel emissions control components and catalysts by Johnson Matthey. The general methodology for model development involves determination of the reaction kinetics using laboratory reactor data, followed by validation of the resulting model against vehicle or engine data. The development of models for diesel oxidation catalysts (DOCs), ammonia selective catalytic reduction (SCR) catalysts, lean nitrogen oxides (NOx) traps (LNTs) and diesel particulate filters (DPFs), including coated filters such as the SCR coated DPF (SCRF®), is discussed.
A new methodology for developing models (or at least adapting existing models) using engine or vehicle data, which offers a faster route to a finished model, is also discussed. The use of this methodology to develop a DOC model capable of predicting the effect of platinum group metal (pgm) loading is presented; the model gives a good prediction of carbon monoxide, hydrocarbon (HC), NOx and nitrogen dioxide (NO2) over a vehicle test cycle for pgm loadings in the range 30 g ft–3 to 120 g ft–3.
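As a highly simplified illustration of the kind of kinetic model behaviour described above (and not Johnson Matthey's actual DOC model), the sketch below computes CO light-off curves from a first-order plug-flow balance with a rate constant scaled linearly with pgm loading; the kinetic parameters, residence time and loading scaling are all hypothetical.

```python
# Highly simplified sketch, not Johnson Matthey's DOC model: first-order CO
# oxidation light-off in an isothermal plug-flow channel, with the rate
# constant assumed to scale linearly with pgm loading. All values hypothetical.
import numpy as np

R = 8.314

def co_conversion(T, pgm_loading, A=5.0e9, Ea=90e3, tau=0.05):
    """CO conversion for a first-order plug-flow balance: X = 1 - exp(-k*tau)."""
    k = (pgm_loading / 60.0) * A * np.exp(-Ea / (R * T))   # s^-1, 60 g ft^-3 reference
    return 1.0 - np.exp(-k * tau)

temps = np.arange(400.0, 600.0, 25.0)                      # K
for loading in (30.0, 60.0, 120.0):                        # g ft^-3
    X = co_conversion(temps, loading)
    T50 = np.interp(0.5, X, temps)                         # light-off temperature
    print(f"{loading:5.0f} g ft^-3 : T50 = {T50:.0f} K")
```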
This review is about the book “Understanding Organometallic Reaction Mechanisms and Catalysis: Computational and Experimental Tools” edited by Valentine P. Ananikov, a professor at the Russian Academy of Sciences who has contributed much to the field of catalysis in the past decade. His work has received international attention along with many awards and research grants. He has been working on developing new concepts in transition metal and nanoparticle catalysis, sustainable organic synthesis and mechanistic studies of complex chemical transformations, which gives him the required background and expertise to edit a book that consolidates various experimental and computational tools used by various research groups around the world.
The performance of a particulate filter is determined by properties that span the macro, meso and atomic scales. Traditionally, the primary role of a gasoline particulate filter (GPF) is to reduce emissions of solid particles and liquid droplets. At the macro scale, the filter's channels and interconnecting pores act as the main transport arteries carrying gas to the catalytically active sites. At the meso scale, the micropore structure is important for ensuring that enough active sites are accessible and that the gas can reach the catalyst nanoparticles. At the atomic scale, the structure of the catalyst material determines the performance and selectivity within the filter. Understanding all length scales requires a correlative approach, but this is often difficult to achieve because of the number of software packages a scientist has to deal with. We demonstrate how current state-of-the-art approaches in the field can be combined into a streamlined pipeline to characterise particulate filters by digitally reconstructing the sample, analysing it at high throughput and eventually using the results as inputs for gas flow simulations and better product design.
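As a small illustration of the "digitally reconstruct then analyse" step (the actual pipeline and software used in the work are not specified here), the sketch below computes a porosity value and a pore-connectivity measure from a binarised 3D volume; the data are random stand-ins for a tomographic reconstruction.

```python
# Illustrative sketch only, not the paper's pipeline: basic morphological
# metrics from a binarised 3D reconstruction of a filter wall, where `volume`
# marks pore voxels as True. The volume here is a random stand-in.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
volume = rng.random((100, 100, 100)) > 0.55      # hypothetical pore/solid voxels

porosity = volume.mean()                         # pore volume fraction

# Label connected pore regions to check how much of the pore space percolates
labels, n_regions = ndimage.label(volume)
sizes = np.bincount(labels.ravel())[1:]          # voxel count per pore region
connected_fraction = sizes.max() / volume.sum()

print(f"porosity = {porosity:.3f}, {n_regions} separate pore regions")
print(f"largest connected network holds {connected_fraction:.1%} of pore voxels")
```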
The present article considers a lattice dynamical study of platinum using the van der Waals three-body force shell model (VTBFSM), motivated by the high stiffness constants C11 and C12. The model is applied to the phonon branch frequencies along the [100] direction and to the phonon density of states (DOS). The study of phonon spectra is important in determining the mechanical, electrical and thermodynamic properties of elements and their alloys. The present model incorporates the effects of van der Waals interactions (VWI) and three-body interactions (TBI) into the rigid shell model (RSM) for the face-centred cubic (fcc) structure, with short-range interactions operative up to second neighbours. Our results agree well with the available measured data for platinum.