# Modeling - Science topic

Explore the latest questions and answers in Modeling, and find Modeling experts.

Questions related to Modeling

I am a biologist trying to become more quantitative and more model-savvy. One excellent tool I have stumbled upon is the SimBiology toolbox for MATLAB, where you can graphically lay out your model. This has been very useful, but I have run up against a problem: I cannot use a conditional statement.

My goal is to make a bacterial growth model such that when there is sugar present, the bacterial population increases while if there is no sugar present, the cells start to die. Does anyone have experience with this kind of "if-then" statement in this program?

I am also open to any recommendations on modeling in general, or other EASY programs that can be used to make models.
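In a general-purpose language the "if-then" logic is easy to express directly in the rate equations. A minimal sketch in Python/SciPy; the rate constants are made-up illustrative values, and this is not SimBiology syntax:

```python
import numpy as np
from scipy.integrate import solve_ivp

GROWTH, DEATH, YIELD = 0.8, 0.3, 0.5   # hypothetical rate constants

def rhs(t, y):
    """y = [bacteria, sugar]: grow while sugar remains, die once it is gone."""
    b, s = y
    if s > 1e-9:                       # the conditional: sugar present
        return [GROWTH * b, -(GROWTH / YIELD) * b]
    return [-DEATH * b, 0.0]           # sugar exhausted: population declines

sol = solve_ivp(rhs, (0, 20), [0.01, 1.0], max_step=0.05)
bacteria = sol.y[0]                    # rises while sugar lasts, then decays
```

In SimBiology itself, this kind of switch is usually expressed with an event or a rate rule rather than a literal if-statement; the sketch above only shows the intended dynamics.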

Currently, I'm doing physicochemical studies of ionic liquids (ILs) and MDEA aqueous mixtures. I have density, viscosity, and surface tension data for the ternary mixtures. Do you have any ideas or suggestions on how to relate the ternary mixtures (IL-MDEA-water) to each of the pure compounds (IL, MDEA, and water) using a model or by setting up empirical equations?

In mathematical and computational models, when you estimate or assume the values of some parameters, there is always a strong tendency for the model to predict inaccurate results, or even to generate errors.

How do you correct the errors generated by your model, especially during prediction? And how do you ensure that your computer model generates accurate results?

Your answers will be highly appreciated

Regards

The numerical modeling of dam breaks based on the shallow water equations has some similarities with the flow produced by a gate valve in an open channel (upstream head water, downstream kinematic wave propagation, turbulence, etc.), although the scale is not the same. Can dam-break models be used as a reference method for modeling water flow with sediment transport in a sewer collector? If not, which models can be used for this purpose? The aim of my study is to model sediment dynamics under the flushing effect of a radial gate valve in a sewer and to evaluate the efficiency of the valve.

The collector has a free surface, with a height of 3 m and a width of 2 m.

I'm trying to model sediment dynamics in a sewer collector under the effect of hydraulic forces (flushing energy) produced by a gate valve. The aim of this research is to develop a 2D model that solves the Navier-Stokes equations, so the model will couple several sub-models: hydrodynamics, sediment behavior, turbulence, etc. Which approach can help me in modeling this two-phase (liquid/solid) turbulent flow? Do you have any ideas about the type of software that could be used and its applications?

What should be the percentage of support vectors from the entire training dataset in order to decide upon good model performance?

From a training dataset of 84 instances, I got 38 instances as support vectors. Is my model performing well? Is there any underfitting?
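One rough yardstick, assuming a standard SVM implementation: Vapnik's bound says the expected leave-one-out error is at most the expected fraction of support vectors, so 38/84 (about 45%) only guarantees a loose LOO error bound of about 0.45. On its own it says little about under- or overfitting; cross-validated accuracy is the more direct check. A sketch with scikit-learn on toy data standing in for the 84-instance set:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Toy data with the same size as the training set in question.
X, y = make_classification(n_samples=84, n_features=5, random_state=0)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
sv_fraction = clf.support_vectors_.shape[0] / len(X)   # fraction of SVs

# Cross-validated accuracy is a more direct fit/underfit diagnostic.
cv_acc = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=5).mean()
```

If `cv_acc` is far below training accuracy the model overfits; if both are low, it underfits, regardless of the support vector count.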

Does anyone have expertise in modeling micro hydropower in simulink or maybe have some advice on good reading material?

I was thinking about the COST-231 Hata model for path loss to model the LTE signal strength in the metro. Am I wrong?

Every sensor behaves differently in different environments. How do you objectively compare them?

I am a PhD student looking at developing models for virtual collaborations in some specific areas such as manufacturing. I wonder if there is any model validation method I could use to validate the models I develop.

A million-dollar research grant was issued to reject null hypothesis X. Unlucky researcher A could not find statistical evidence to reject X: his test found a non-significant p value of 0.1. You still believe in the alternative hypothesis and replicate researcher A's study. You, too, find a p value of 0.1. What is your conclusion? How does this finding influence your beliefs about the null and alternative hypotheses?
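One quantitative way to frame the replication, assuming the two studies are independent, is to combine the p values. Under Fisher's method, -2 * sum(ln p_i) is chi-square distributed with 2k degrees of freedom under the null, and two independent p = 0.1 results combine to roughly p = 0.056: suggestive, but still not conclusive evidence against X at the 5% level. A sketch with SciPy:

```python
from math import log
from scipy.stats import chi2, combine_pvalues

pvals = [0.1, 0.1]                     # researcher A's study and the replication

# Fisher's method by hand: X^2 = -2 * sum(ln p_i), chi-square with 2k d.o.f.
stat = -2 * sum(log(p) for p in pvals)
p_combined = chi2.sf(stat, df=2 * len(pvals))

# SciPy provides the same computation directly.
stat2, p2 = combine_pvalues(pvals, method="fisher")
```

A Bayesian would instead update beliefs with the likelihood of observing two such results under each hypothesis; either way, two p = 0.1 results are jointly stronger evidence than one.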

Trajectory tracking control of dynamic nonholonomic systems with unknown dynamics.

What are the main components of a good undergraduate computational physics program? Which resources if any are available for new faculty engaged in such initiatives?

Does anyone have experience modeling a PMSG in the d/q reference frame? I built a simple model according to the governing equations. The inputs are Vd, Vq, and omega; the outputs are id and iq. I set values for the input variables corresponding to a zero d-axis current in the generator's steady state, so I was hoping to obtain a zero id value, but the result was not what I expected. I checked the equations and everything; nothing seemed wrong. Has anyone had this problem before? Please tell me how to solve it.

I am trying to model pinned supports for a rectangular RC floor slab in Abaqus. Does anyone know a good way of doing so? I am new to Abaqus software.

I have a model and want to know if it can be calibrated for the economy of Pakistan.

I've been using the J-test for model selection, but apparently it's not a good measure when models have different degrees of freedom. To my understanding, the Akaike information criterion (AIC) or Bayesian information criterion (BIC) could be used, as they account for the degrees of freedom. I believe either criterion can simply be used with the Indirect Inference method, since we have the maximized value of the likelihood function, but what about the Method of Simulated Moments? Can I just use the negative of the argmin function, or is it more sophisticated than that?

I’m looking forward to hearing your suggestions for model selection (on the use of AIC, BIC, or any other techniques) in Indirect Inference and Method of Simulated Moments.
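For the likelihood-based case the criteria are one-liners; the definitions below assume k estimated parameters, n observations, and a maximized log-likelihood ln L. For the Method of Simulated Moments there is no likelihood, so the negative of the minimized objective is not directly comparable across models; GMM-style analogues such as the Andrews-Lu moment and model selection criteria, which add BIC/AIC-type penalties to the J statistic, are designed for that situation.

```python
import numpy as np

def aic(loglik, k):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k ln n - 2 ln L (lower is better)."""
    return k * np.log(n) - 2 * loglik

# A model with one extra parameter must raise the log-likelihood by more
# than 1 (AIC) or ln(n)/2 (BIC) to be preferred.
```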

I need to know how to determine the discharge from the groundwater model, so that an optimum pumping rate can be suggested to remediate the contaminated groundwater using the pump-and-treat method.

I am attaching my procedure for unit conversion; I am confused by the units of doping density.

The diffuse reflection boundary condition (BC) for the Boltzmann equation is widely used. However, it is not easy to find results on the derivation of this BC, or more generally on boundary conditions for the Boltzmann equation. I would be interested in any references you may have pertaining to this question.

Which models can be used for simulating EIS of PEM fuel cells?

Some researchers argue that, in the location-routing problem, location is a strategic decision while routing is a tactical one: routes can be recalculated again and again, whereas locations are usually fixed for a much longer period. They therefore claim that it is inappropriate to integrate location and routing in the same planning framework.

I haven't come across any other dance being modeled for automation. Can anyone point me to some papers?

Conference Paper Modeling BharataNatyam Dance Steps: Art to SMart

I used bootstrapping to draw my observations from a population, given the structure of a multivariate model. In order to alleviate sampling bias, I intend to estimate the model, say, 1000 times and somehow infer the parameter estimates from the 1000 different fits.

I'm thinking of using the mean of these 1000 estimates as my estimate for each parameter, and their standard deviation as the parameter's standard error. Is this recommended? Is there literature using this method?
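The more common convention (e.g. in Efron and Tibshirani's bootstrap texts) is to keep the full-sample fit as the point estimate and use the bootstrap standard deviation only as the standard error; the mean of the bootstrap estimates mainly serves to diagnose bias. A sketch with a scalar estimator standing in for the multivariate model:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap(data, estimator, n_boot=1000):
    """Re-estimate on n_boot resamples drawn with replacement."""
    n = len(data)
    return np.array([estimator(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])

data = rng.normal(loc=5.0, scale=2.0, size=200)   # stand-in sample
boot = bootstrap(data, np.mean)

point = np.mean(data)          # full-sample estimate, kept as the point estimate
stderr = boot.std(ddof=1)      # bootstrap standard error
bias = boot.mean() - point     # bootstrap estimate of bias
```

If `bias` is large relative to `stderr`, a bias-corrected estimate (or a bias-corrected interval such as BCa) is usually preferred over simply averaging the replicates.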

We have two sets of differential equations, for neuron dynamics and astrocyte Ca2+ oscillations, but with different time scales (Di Garbo 2009). Is it correct that, to simulate these sets simultaneously, one must rescale one set by a factor of 1000 in the integrator?
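Not necessarily. A generic alternative, assuming both sets can be written into one right-hand side, is to hand the coupled system to a stiff-capable integrator and let its step-size control manage the disparate scales. A sketch with a generic fast/slow pair (not the Di Garbo model):

```python
from scipy.integrate import solve_ivp

EPS = 1e-3          # time-scale ratio: the fast variable relaxes ~1000x faster

def rhs(t, y):
    slow, fast = y
    return [-slow + fast,            # slow dynamics (Ca2+-oscillation scale)
            (slow - fast) / EPS]     # fast dynamics (neuronal scale)

# LSODA switches automatically between stiff and non-stiff methods,
# so no manual rescaling of either equation set is required.
sol = solve_ivp(rhs, (0, 5), [1.0, 0.0], method="LSODA",
                rtol=1e-8, atol=1e-10)
```

Rescaling time in one set by 1000 is a valid equivalent transformation, but it is a convenience, not a requirement; the solver's adaptive stepping is what actually copes with the stiffness.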

I'm putting together a list of ongoing modeling works (from different research areas) which try to understand and predict the behavior of an interesting phenomenon. Just a few examples to kick off the discussion:

- In environmental systems, global climate change modeling seems to be a perpetual challenge: making climate models accurate enough for quantitative prediction has long bedeviled the field.

- In obesity and nutrition, the current literature provides over 100 statistical equations to estimate basal metabolic rate (BMR) as a function of different attributes (e.g. age, weight, height, etc.), yet understanding how BMR is precisely modeled based on those attributes is an interesting area of research.

There seem to be tons of examples in biology, psychology, economics, engineering, etc, but what are the 'publicly interesting' challenges that you would like to add here?

My research is based on the knowledge management for construction. I am exactly aiming for Building Information Modeling (BIM). What information do you think would be relevant for me to collect through a survey if I want to tie it up with lean construction? Any suggestions on how I should proceed? Thank you.

I have made a 3D interpolation in SGeMS (the Stanford Geostatistical Modeling Software).

The target grid was a masked 3D grid; after interpolation, the results show a grey rectangle that blocks the target grid (see attachment).

Does anybody know how to remove the grey rectangle box?

Please help me with some mesoscale modelling of concrete.

Using models other than the SWAT model.

What aspects of operational modelling can be formulated by mereotopology?

Mereotopology has already been used for modelling the assembly process of products. I am interested in using it to define the operational structure of products and to develop a formal definition of the design process.

By multiscale models I mean models which range across multiple spatial and/or temporal scales. I would be particularly interested if there is a methodology describing how to integrate submodels from different spatial/temporal scales. No specific area was mentioned (e.g. computational systems biology) because I would be interested in the general theory common to most multiscale models, if such a theory exists.

By grey system theory I mean the one defined by Deng Julong in 1982: 'As far as information is concerned, the systems which lack information, such as structure message, operation mechanism and behaviour document, are referred to as Grey Systems.' (Deng Julong)

A lot of research in grey system theory has been done since 1982. There are books and articles, but performing all the calculations manually is not effective, especially when some changes have to be introduced repeatedly.

I am looking to validate the EPIVIT model for in-season virus spread in potato. The original model is in a very old computer language (Pascal) and might be difficult to acquire.

Has anyone worked with this model? Any leads on where I can find an updated code?

I'm trying to model a deep excavation near an existing tunnel, using TNO DIANA.

I will be conducting research on "Revisiting Republic Act No. 9512: An Act to Promote Environmental Awareness Through Environmental Education: Basis for the Development of an Environmental Education Model". How are such models developed?

Suppose that a model developed in Ansys with free-free boundary conditions is validated against free-free experimental results. Is it correct to use the model with fixed boundary conditions and still expect the results to be realistic?

With special concern to multimodel selection, linear regression and theoretic-informational approaches (e.g. AICc).

I want to study the dynamic behavior of a complex power system, including some HVDC links. In which simulation software can I study this?

Usually a constant head or a constant flux is imposed in the literature; both cases neglect the inland groundwater level oscillation caused by precipitation and evaporation.

I wonder if someone knows how to model springs in Abaqus whose nonlinear stiffness is also temperature-dependent?

My aim is to model bond deterioration between reinforcement and concrete at high temperature.

Can anyone help with controlling for covariates and types of sums of squares?

In the R vegan package, the functions 'rda', 'cca', and 'capscale' can take a conditioning matrix in the formula to control for ('partial out') the effect of some covariates before the next step. These functions are for modeling multivariate data.
Also, I have learned that type II and III sums of squares are better for testing the significance of one factor while controlling for the levels of the other factors (for example, the explanation on the types of SS here: http://mcfromnz.wordpress.com/2011/03/02/anova-type-iiiiii-ss-explained/). However, the author also cautioned that if there is a significant interaction, type II is inappropriate while type III can still be used, but the interpretation on the effect of one variable is difficult.
In my data, there is one dependent variable Y and four independent variables A, B, C, and D. As C and D do not have a significant effect on Y, I can drop them from my model. It is also known that there is a significant interaction AxB. So is there a way I can still tell the effect of B after controlling for the effect of A and AxB somehow? Or how should I interpret the effect of B in the presence of a significant AxB interaction?
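One standard answer, sketched here with simulated data in Python/statsmodels rather than vegan: with a significant AxB term, a single "main effect of B" is not well defined, so report B's simple effects within each level of A. The data below are synthetic, with the interaction built in:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({"A": rng.choice(["a1", "a2"], n),
                   "B": rng.choice(["b1", "b2"], n)})
# Y has a real A x B interaction: B matters only when A == a2.
df["Y"] = (rng.normal(size=n)
           + 2.0 * ((df["A"] == "a2") & (df["B"] == "b2")))

full = smf.ols("Y ~ C(A) * C(B)", data=df).fit()
tab = anova_lm(full, typ=2)            # type II sums of squares

# Simple effects: the effect of B estimated separately at each level of A.
simple = {level: smf.ols("Y ~ C(B)", data=sub).fit().params["C(B)[T.b2]"]
          for level, sub in df.groupby("A")}
```

Here `simple["a2"]` is large while `simple["a1"]` is near zero, which is exactly the story the interaction term tells; reporting both is more informative than any single adjusted effect of B.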

There are straightforward analogies between electrical, mechanical, acoustic, thermal, hydraulic, and fluid systems, which are intuitive and useful. Sometimes we use them when we teach mechanical, thermal, and fluid systems. Is it possible to build a neural network using analogs? For example, resistors can be analogous to the weights of a neural network, etc.

What are the most recent techniques and models being used for urban growth modeling?

Are there any models other than cellular automata that can perform spatial allocation?

Multinomial or ordered choice: which one is applicable?

I am trying to model a surface by using software

The enzyme is obtained from a wild bacterium. Its crystallographic structure has not been determined, but the structures of similar enzymes have been. How can I study the effect of a ligand on the enzyme structure using computational tools?

I do a lot of modelling and system analysis. The best medium for that is paper; however, it would be handy to have a piece of software for building these diagrams on a computer for publications, presentations, or teaching. Up until now I have used vector image software such as Inkscape or Adobe Illustrator.

I wonder if there is menu-driven software that can perform variance partitioning (Borcard et al., 1992; Borcard and Legendre, 1994) in order to separate the importance of spatial dependence and spatial autocorrelation for community distribution data?

How can I determine the stress intensity factor from the simulated sandstone particles?

This is something I have been pondering lately after attending a number of related academic and industry-led events, yet no definition is ever made clear. The term 'Big Data' has become a very popular buzzword, yet researchers in many scientific and mathematical fields have been analysing and mining large datasets for many years (e.g. satellite data, model data). Does it, then, refer to big data in a social sciences or business context, or does it more correctly refer to the increasingly accessible, ubiquitous, real-time nature of the multitude of datasets we are now exposed to (e.g. data from sensors, WSNs, crowdsourcing, Web 2.0)? Or indeed both?

For a prospective occupational cohort where everyone is exposed to one or more chemical agents, examining BMI at follow-up compared to a specific chemical exposure at baseline, is it necessary to control for baseline BMI? Is it better to model change in BMI or BMI at follow-up? There is no unexposed group -- just cohort members unexposed to some agents versus others. All analyses are within-cohort.

Please suggest a good workstation for modeling studies.

Hello all, I want to purchase a workstation for modeling studies. Can anyone suggest software names, or a workstation package suitable for both docking and molecular modeling? Thanks.

I am working on a protein that has a functional site on a surface loop. The papers on docking studies that I went through do not report any docking at surface loops, although internal loops are reported. I am planning to dock ligands at the loop and predict the motion of the protein. Will docking at a surface loop be reliable (given that one of the residues is a glycine)? Can anyone suggest articles that report this kind of docking?

There are many temperature-dependent insect studies that model distributions of development time (not to be confused with development rate) by applying the popular Weibull function (which is intensively described by Wagner in his 1984 paper “Modelling Distributions of Insect Development Time: A Literature Review and Application of the Weibull Function").

However, most published studies or web-based integrated pest management programs stop short or provide little information regarding the incorporation of this work into phenology simulation models. Are there acceptable methods for applying modelled distribution of development time for different insect life-cycle stages to phenological/voltinism simulation models or is the practice kept separate to avoid introducing even greater variability to an already complicated predicting process?

For example, models using or suggesting data about food, exercise, etc. I am not interested in individual models for food recommendation (e.g., recipe- or ingredient-based approaches), but rather in a holistic approach to complete healthcare (self-)management.

I am currently working on creating an information system for tidal flooding. Can anyone recommend software for building the modeling application?

The current model uses differential equations as its basis. But could replacing these with a spiking-neural-network-based model be a good next step?

I want to constrain a classical project scheduling problem with a Cmax objective function. For example, the first and second activities can both start at the beginning of the project, but I want to restrict the model so that activities 1 and 2 never run at the same time, without knowing in advance which one is scheduled earlier. Moreover, in my model only these two activities use the same resource. This resource is bounded, hence the model is a special case of the RCPSP. How can I handle the resource constraint in my model?

I used the GMSYS software, and I had to enter the density contrasts for the different rock blocks between the station and sea level (the gravity base level); I have seen this in the manual. I am wondering whether this is the correct approach, or whether I need to set a zero value for the topography, since the data are supposed to have been corrected for topography and for the slab below the station when the Bouguer anomaly is used for modeling.

I tried to apply Sobol (2002, 2007) sensitivity analysis to an LSM to evaluate, for the different parameters, the first-order and total Sobol indices Si and ST. But I always find implausible results (e.g., ST = 0.000000009 and Si = 0.999999222 for all the parameters). At first I thought it was a Monte Carlo dimensionality problem, so I repeated the experiment with a huge ensemble size (N = 600,000 for only 2 parameters) and found the same results. I'm sure my code is correct, because when I use the Ishigami and Sobol benchmarks I obtain sensible results (e.g., as in the attached paper link, page 11). Could anyone help me understand this problem? Thanks in advance for all your answers and comments.

I need one that has realistic models of muscle and vascular tissues.

I want to decompose an image into multiple scale bands using the TV-L1 model in MATLAB. I have the source code for the TV-L1 model, but how can I use it for decomposition into multiple bands? Can anyone help me in this regard?

I am looking for an explicit formula that weighs the predictions from each database into a combined one.

I already know "femm", which is 2D and static.

I'm looking for a copy of FITEQL, but after extensive searching I can't seem to find it anywhere online. Any help would be appreciated.

I want to be able to do something like this: x = [0:1:50]
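Assuming the bracketed expression is MATLAB-style colon notation (start 0, step 1, stop 50 inclusive), the NumPy equivalents are:

```python
import numpy as np

x = np.arange(0, 51)           # arange excludes the stop value, hence 51
x2 = np.linspace(0, 50, 51)    # or give the number of points instead of the step
```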

Can anyone recommend sources of algorithms and numerical methods for modelling and simulation?

I am looking for a good source of algorithms and numerical methods for modeling and simulation, mainly oriented to structural bioinformatics. I have found the book "Biological Modeling and Simulation: A Survey of Practical Models, Algorithms, and Numerical Methods" by Russell Schwartz. I would like to know of other sources. I prefer a programming-language-agnostic presentation of the algorithms, or examples oriented to Python, Fortran, or C.

I have legacy code written in FORTRAN 77 and am looking for an f77 compiler. I have been using gfortran, but it doesn't work well.
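gfortran does compile FORTRAN 77; most problems with legacy f77 code come from non-standard extensions rather than from the compiler itself. Flags along these lines often help (the file names here are placeholders):

```shell
# Treat the source as fixed-form F77 and accept common legacy extensions.
gfortran -std=legacy -ffixed-form -o legacy_prog legacy_code.f

# If the code assumes static (SAVEd) local variables, as much old F77 does:
gfortran -std=legacy -fno-automatic -o legacy_prog legacy_code.f
```

If gfortran still misbehaves with these flags, isolating the failing routine is worthwhile: it usually points to undefined behaviour in the code rather than a compiler limitation.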

We have designed a conceptual framework that could be useful for addressing the problem of documenting, storing, and executing models and algorithms created by ecologists and environmental managers. The attached file shows the most important functions.

The UML class diagram is somewhat more extensive; I mean, it is more flexible in things like multiplicity, generalization, etc. But in the end it looks like a very detailed ER diagram, so I wonder whether a relationship between these diagrams exists?

Solar PV performance is affected by various factors such as cell temperature, irradiation, dust, etc. Many methodologies have been developed, both numerical and measurement-based. What are the latest trends in this area of research?

I am using multi-agent systems for modeling wireless sensor networks. Are there any OMNeT++ extension modules for multi-agent systems?

How do you develop a hybrid model?

The aim of this question is to gather your opinions on the models/proposals/frameworks used to measure or define the quality of a modeling language for requirements engineering. So far I have a few ideas, like QM4MM for evaluating the maturity of the metamodels of these languages, or SEQUAL (and other works of Krogstie).

Simulation of water treatment processes constitutes an important research theme. In comparison with real work at the industrial scale, what is the real place of simulation in developing water treatment technology?

How do you estimate or measure the impact of computational simulations and mathematical modelling on infectious disease research in America, Europe, Africa, Asia and the Middle East? Is there anything like computational epidemiological modelling metrics?

I am trying to study DNA (ligand)-protein (receptor) interactions by molecular docking, hence I need a .pdb file for the DNA ligand.

I would highly appreciate it if you could lend me your valuable suggestions/advice.

In a directed network of agents passing knowledge to each other, how fast does the information in a node grow with the number of edges incident on it: linearly or exponentially? On one hand, the knowledge of the node agent seems to grow with the sum of the knowledge shared through the incoming edges; on the other hand, each item of information shared through one edge might recombine with each item of incoming information from the other edges to form new items. For example, if agent A tells me that there is a traffic jam near the shopping center and agent B tells me that this morning there are big sales there, I acquire a third piece of information by combining what agents A and B told me: the traffic jam is caused by the sales and won't stop until the sales are over. Any ideas on which model better suits reality? Sorry if this sounds like a rather naive question, but since it is related, yet not central, to my research, I have not done a literature search on it.
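The two hypotheses in the question can be made concrete by counting, under stated assumptions. If each of the k incoming edges contributes one item, growth is linear. If, additionally, every *pair* of items can recombine into one new item (as in the traffic-jam example), growth is quadratic, k + C(k, 2). Only if arbitrary *subsets* of items recombine do you get exponential growth, 2^k - 1. A sketch:

```python
from math import comb

def linear_items(k):
    """One item per incoming edge."""
    return k

def pairwise_items(k):
    """Direct items plus one recombined item per pair of edges."""
    return k + comb(k, 2)

def subset_items(k):
    """One item per non-empty subset of incoming items: exponential."""
    return 2 ** k - 1
```

The traffic-jam example is `pairwise_items(2) = 3`: two shared items plus one inferred combination, so pairwise recombination alone gives polynomial, not exponential, growth.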

I have the reflectance profiles of soil samples from 360-2500 nm at 1 nm intervals. I want to model the soil chemical attributes using these reflectance data, but 1 nm intervals are cumbersome to work with. So far, I've been using the reflectance every 10 nm as predictors in my models. My question is whether there is some way to maximize the strength of the relationship between reflectance and soil chemical attributes without regressing each 1 nm reflectance value against each soil attribute? My thinking is that since reflectance values at nearby wavelengths are positively correlated, there must be some suitable resampling methodology.
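One simple resampling that uses all the data, instead of discarding nine out of every ten values, is to average the reflectance within consecutive 10 nm windows; since neighbouring wavelengths are highly correlated, the window mean keeps the signal and reduces noise. (Partial least squares regression is the usual way to use all 2141 correlated predictors directly.) A sketch with a synthetic spectrum standing in for the real data:

```python
import numpy as np

wavelengths = np.arange(360, 2501)                            # 1 nm grid
spectrum = np.random.default_rng(2).random(wavelengths.size)  # stand-in data

def band_means(values, width=10):
    """Average consecutive `width`-sample blocks (drops an incomplete tail)."""
    n = (values.size // width) * width
    return values[:n].reshape(-1, width).mean(axis=1)

bands = band_means(spectrum)     # 214 predictors instead of 2141
```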

Do partial differential equations have a lot to do with electrochemistry?

Using SRTM data, I want to make different elevation zones in ArcMap. Can any expert guide me through a simple procedure in ArcMap for preparing a number of polygons covering certain elevation ranges in my area of interest?

Steam turbine transfer function modelling.

In designing classifiers (using ANNs, SVMs, etc.), models are developed on a training set. But how should a dataset be divided into training and test sets? With little training data, our parameter estimates will have greater variance, whereas with little test data, our performance statistic will have greater variance. What is the compromise? Depending on the application or the total number of exemplars in the dataset, we usually split it into training (60 to 80%) and testing (40 to 20%) without any principled reason. What is the best way to divide a dataset into training and test sets?
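In practice the split ratio matters less than making the evaluation resample-based when data are scarce. A sketch with scikit-learn: a stratified 70/30 split, plus k-fold cross-validation, which lets every instance serve in both roles and reduces the variance of the performance estimate:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)

# Stratified 70/30 split: both sets keep the original class balance.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

test_acc = SVC().fit(X_tr, y_tr).score(X_te, y_te)

# 5-fold CV: every exemplar is used for training and (once) for testing.
cv_scores = cross_val_score(SVC(), X, y, cv=5)
```

With a single small test set, `test_acc` can swing noticeably with the random seed; the spread of `cv_scores` makes that variance visible instead of hiding it.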

Studying karst aquifer behavior by using spring discharge time series.

I'm trying to install a free Windows Fortran compiler to run Abaqus subroutines. I found a couple of compilers, gcc and Open64, but they do not seem to be Windows-compatible (I couldn't install them). If you know of an alternative Fortran compiler, please let me know. In addition, an explanation of how to link the compiler to Abaqus (I mean, what to write in the environment variables and so on) would be helpful.

Does anybody know of an example of a hybrid of Lagrangian relaxation and Benders decomposition in MIP?

I’m writing a report about modelling. I’d like to have your opinion on something.

What are the best citations for these two paragraphs?

1- In the classical general linear model, the response variable is continuous and follows a normal distribution (linear regression analysis, ANOVA, ANCOVA), with least squares as the main estimation method.

So far I have only cited one book: "McCullagh, P., & Nelder, J. A. (1989). Generalized linear models (Vol. 37). CRC Press."

2- There has been a recent increase in interest in the application of statistical modeling to medicine, biomedicine, public health, and biology.

I'd like to cite something similar to the tutorial/review by Ben Bolker: "Bolker, B. M., Brooks, M. E., et al. (2009). Generalized linear mixed models: a practical guide for ecology and evolution. Trends in Ecology & Evolution, 24(3), 127-135."

I need to simulate a gas distribution network; in particular, I need to perform steady-state analysis on this network.

Hello,

there are plenty of codes on the net, commercial and open-source.

Generally it ends up like this: PhD students develop some promising open-source code, projects get funded, a start-up may be launched, and that is the end of the open-source code, since no community has grown around it. Probably the Comsol software was born like this. Or the student gets their degree, and the code falls asleep.

Why isn't there a central place to coordinate all of these: openEMS, openFOAM(-extend), FreeCAD, Cluster-DEM, openCASCADE/SALOME (arguably at the limit of what open-source should be), and many more?

A lack of homepage maintainers? No academic publications to live on? No, but probably a good reputation to earn.

Thanks

Moreover, I would really appreciate any ideas on this. It is a semester project and I have no idea about it; besides, there is not much published literature, and I am having difficulty coming up with an idea.

How useful will this approach (or very similar ideas) be to the development of a mature systems biology?

http://pysb.org/
Python framework for Systems Biology modeling.

I have a problem with modeling the PSF as a function of defocus (depth). So far I just know that the shape of the PSF is similar to the aperture, but scaled. Here my aperture is not circular, so a (scaled) Gaussian kernel is not good.

The attached image shows the shape of the apertures; white means transparent. Currently I just load the image and use imresize in MATLAB to get the size I need, but I guess that is too naive.

So could someone teach me how to get a better model of a PSF whose size changes in accordance with depth (defocus)?

Thanks !

I am currently thinking about what deliverables to request from the students in a new software engineering / software development project. Although the project will involve a significant implementation part, the software engineering part is crucial. To ensure that the students design and model their software system well, I would request the students to model their system to be developed using a variety of notations, such as:

During the requirements phase:

• An *activity diagram* to provide a high-level view of the behavior and flow of the software system to be developed.

• Either *use cases* or *user stories* to capture the different usage scenarios that the system should be able to handle.

During the analysis phase:

• A conceptual UML *class diagram* for describing the main concepts of the system to be developed.

• Either an *ORM* (Object Role Modeling) or *ER* (Entity Relationship) diagram to describe the data that the system will need to handle.

During the design phase:

• A detailed UML *class diagram* for describing the static structure of the system in terms of classes and operations.

• UML *sequence diagrams* for describing the dynamic behavior of the system.

• A *relational schema* of the database that the system will be using.

Of course, apart from the above, many other kinds of diagrams exist, such as goal diagrams, feature diagrams, agent models, state machine diagrams, formal specification schemas, object diagrams, deployment diagrams, package diagrams, component diagrams, architectural models, interaction diagrams, message sequence charts, design patterns, and many more.

In your opinion, what kind of diagrams should or could be part of such a project and which diagrams are less important?

What are the existing models and their limitations? Other than EIO-LCA, is there anything being used at a large scale?

Nitrogen is one tricky element to study in the field, due to soil variability and the fact that N is very mobile. Experienced folks have told me that, using some models, they have not been able to get realistic findings close to actual measurements. How can one make better simulations of N response in cereals, say in a trial of 3 seasons?

For investigating resonant changes or displacement variation of the beam.