is the one about vaults (see Figure 1.4) noted by Leonardo da Vinci (1452–1519). This rule states that the chords connecting the top of the structure to the extreme points at the bottom should not cross the inner arch, in order to prevent failure (Benvenuto 1990, p. 9).
Figure 1.4. Empirical rule on vaults: lines ab and bc will remain within the wall
Empirical rules remained in use until the 18th century, when structural computations gradually began to come onto the scene. One of the starting points of this trend, due to Robert Hooke (1635–1703), was the rule “ut tensio, sic vis”, “extension is directly proportional to force”, which became the foundation of the theory of elasticity. It can be assumed that iterations on structures began, first in terms of dimensions and then in terms of shapes, with the introduction of structural computations. But since structural theory was still at a primitive level and computational tools were not available, these early iterations were very naïve. The method advanced in 1930 by Hardy Cross (1885–1959), known as the Cross method and technically as the “moment distribution method”, became a very important tool for engineers in structural computations (Cross 1930; Eaton 2001). The introduction of computers around 1960 enabled designers to carry out their computations much more efficiently. Finally, FEMinization, as defined above and advanced towards the end of that decade (Felippa 2001), enabled engineers to analyze all types of structures very efficiently, eliminating the fear of large numbers of unknowns and of complex structures.
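In modern notation, Hooke’s proportionality is commonly written as F = k ΔL, or, per unit area and unit length, as σ = E ε, where k is the stiffness of the member and E is the modulus of elasticity; it is this linear relation on which the later matrix formulations of structural analysis, including FEM, are built.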
With these appropriate tools and theory, engineers started to make efficient iterations in designing structures, since it was now possible to repeat analyses easily by changing the dimensions of the members and even their placements.
1.3. From FEM to AI
The Cross method and similar ones, like the slope deflection method, could be applied only to beams and frames; thus, they had a limited area of application. Other types of structures were all analyzed by special methods, based on theories of elasticity, strength of materials, plates and shells that were advanced for their time. FEM, within a very short time after its introduction, became capable of handling all types of structures, including trusses, beams, frames, plates, shells, grids, volumes and any combination of them. Except for some special structures, the CPU (central processing unit) time spent on a normal structure was negligible; CPU time became important only when the number of unknowns in the problem at hand was immense, or when there were nonlinearities or other particular characteristics. FEM thus suddenly became the universal tool in engineers’ hands for analyzing buildings, dams, bridges, etc. This, of course, enabled engineers to iterate on sizes and also on shapes and topology. In fact, iterating on sizes, i.e. on dimensions, is much easier than the other two; iterating on shapes and especially on topology is far more delicate, since changing them may also change many other parameters, the effects of which cannot easily be foreseen.
Reaching this level enabled engineers to optimize structures far better than was possible before FEMinization. Choi et al. (2016) cite the work of Schmit (1960) as the initiator of structural optimization by systematic means. Schmit enumerates three steps in this process (a toy illustration of the resulting cycle is sketched after the list):
1) establish a trial design;
2) carry out an analysis based on this trial design;
3) and, based on the analysis, modify the trial design as required.
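A minimal sketch of this trial–analysis–modification cycle is given below. It is only an illustration of the loop, not Schmit’s actual formulation: the bars, forces and allowable stress are assumed values, and analyze() stands in for whatever analysis method is available (hand computation in Schmit’s day, FEM today).

def analyze(areas, forces):
    # Step 2: analysis of the trial design; here trivially stress = N / A
    # for independent bars, in place of a real structural analysis.
    return [n / a for n, a in zip(forces, areas)]

def design_cycle(areas, forces, allowable=15.0, iterations=20):
    for _ in range(iterations):
        stresses = analyze(areas, forces)
        # Step 3: modify the trial design so that each bar moves toward
        # its allowable stress (a simple resizing rule, for illustration).
        areas = [a * abs(s) / allowable for a, s in zip(areas, stresses)]
    return areas

areas0 = [100.0, 100.0, 100.0]     # Step 1: establish a trial design (cm^2, assumed)
forces = [1200.0, 800.0, 450.0]    # assumed axial forces in the bars (kN)
print(design_cycle(areas0, forces))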
At that time, when computers were available but FEM had not yet been introduced, Schmit used a systematic way of optimizing a structure under different loadings, without any user intervention, on an example of a truss with three bars. In fact, a structural optimization process for a fully general type of structure is a stage that has not been reached even in our times. Several commercial software packages are currently on the market; three of them are compared with one another on several structural optimization problems by Choi et al. (2016). In these packages, analysis is performed by FEM and optimization is carried out by nonlinear optimization methods.
More recent studies on the design of structures make use of metaheuristic algorithms (MAs) in the optimization part of the process (see, for example, Toklu (2009); Ahrari and Atai (2013); Saka et al. (2016); Techasen et al. (2019)). Indeed, MAs are much more versatile than classical mathematical optimization techniques: they accept any type of state variable, they deal with constraints and multiple objectives very easily, and nonlinearity imposes no difficulty on them (Gandibleux et al. 2004; Sirenko 2009; Boussaïd et al. 2013; Collette and Siarry 2013; Toklu and Bekdaş 2014; Sorensen et al. 2018; Almufti 2019).
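As an illustration of why constraints and nonlinearity cause MAs so little trouble, the sketch below minimizes the weight of a few independent bars under a stress limit, handling the constraint with a simple penalty term. The algorithm is a plain random-perturbation search standing in for a metaheuristic, and all numbers are assumed, illustrative values rather than data from the cited studies.

import random

LENGTHS = [100.0, 150.0, 200.0]    # assumed bar lengths (cm)
FORCES = [1200.0, 800.0, 450.0]    # assumed axial forces (kN)
ALLOWABLE = 15.0                   # assumed allowable stress (kN/cm^2)

def penalized_weight(areas):
    # Objective: total material volume, plus a large penalty whenever a
    # bar stress exceeds the allowable value (constraint handling).
    volume = sum(l * a for l, a in zip(LENGTHS, areas))
    violation = sum(max(0.0, abs(n / a) - ALLOWABLE) for n, a in zip(FORCES, areas))
    return volume + 1.0e4 * violation

def random_search(iterations=5000, seed=1):
    rng = random.Random(seed)
    best = [100.0, 100.0, 100.0]   # a heavy but feasible starting design
    best_value = penalized_weight(best)
    for _ in range(iterations):
        trial = [max(0.1, a + rng.uniform(-2.0, 2.0)) for a in best]
        value = penalized_weight(trial)
        if value < best_value:     # keep only improving candidates
            best, best_value = trial, value
    return best, best_value

print(random_search())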
In most applications, optimization is applied at the design stage to determine the most appropriate structure as far as topology, shape and size are concerned, with the analysis performed by FEM as mentioned above. In some recent applications, the analysis itself is also carried out by an optimization process, through a technique called Total Potential Optimization using Metaheuristic Algorithms (TPO/MA) (see, for instance, Toklu (2004a); Toklu and Uzun (2016); Toklu et al. (2017); Bekdaş et al. (2019a); Nigdeli et al. (2019); Kayabekir et al. (2020a); Toklu et al. (2020)). TPO/MA is nothing but the application of FEM with an optimization process using soft computing methods in place of the solution of matrix equations; thus, in a more general way, it deserves to be called the Finite Element Method with Energy Minimization (FEMEM).
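The principle behind TPO/MA can be conveyed with a toy example. In the sketch below, assumed to consist of two springs in series under a point load, the nodal displacements are found by minimizing the total potential energy (strain energy minus the work of the external load) instead of by solving the stiffness equations; the values are illustrative, and a plain random-perturbation search again stands in for a metaheuristic.

import random

K1, K2, P = 200.0, 100.0, 50.0     # assumed spring stiffnesses (kN/cm) and load (kN)

def total_potential(u):
    # Total potential energy of the two-spring system for displacements u = (ua, ub).
    ua, ub = u
    strain_energy = 0.5 * K1 * ua ** 2 + 0.5 * K2 * (ub - ua) ** 2
    return strain_energy - P * ub  # minus the work done by the external load

def minimize_energy(iterations=20000, seed=1):
    rng = random.Random(seed)
    best, best_pi = [0.0, 0.0], total_potential([0.0, 0.0])
    step = 0.5
    for i in range(iterations):
        trial = [x + rng.uniform(-step, step) for x in best]
        pi = total_potential(trial)
        if pi < best_pi:
            best, best_pi = trial, pi
        if (i + 1) % 5000 == 0:
            step *= 0.5              # shrink the search neighbourhood gradually
    return best

# The exact (stiffness-equation) solution is ua = P / K1 = 0.25, ub = ua + P / K2 = 0.75.
print(minimize_energy())

Replacing the random search with a genuine metaheuristic, and the two springs with the elements of a full finite element model, conveys the general idea of FEMEM.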
The use of metaheuristic algorithms in the analysis of structures, together with topology, shape and size optimization, takes the structural design problem to a very different level. The importance of stochastic algorithms in structural design has already been noted by Kress and Keller (2007). Artificial intelligence (AI), as first imagined by Alan Turing (1912–1954), requires computing machines “(1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles” (Copeland 2000). Metaheuristic algorithms are exactly the ones guided by rule-of-thumb principles. Thus, we can readily say that structural design, with all its components, now belongs to the area known as AI (Lu et al. 2012; Yang 2013; Yang et al. 2014a; Hao and Solnon 2020).