Heat transfer in a microscale model using pulsed heating
Posted 19 June 2024, 02:05 UTC−4 · Heat Transfer · Version 6.0 · 4 Replies
Hello everyone,
I am new to COMSOL and I am trying to build a model to verify experimental results acquired in the following paper: https://www.science.org/doi/10.1126/sciadv.adj3825
As a brief summary: the experiment used a pulsed electron beam to heat a sapphire sample, thereby inducing thermal waves inside the sample. In the paper, the beam (heat source) was scanned while the thermocouple (detector) was kept stationary. In COMSOL, for simplicity, I instead fixed the heat source at one position and placed point probes (1 to 5) along the length of the sample, as shown in the attached "Schematic figure.jpg" file.
Theoretically, thermal waves detected farther from the heat source should be delayed more than those detected closer to it. The phase delay difference is given by: Δθ = θ1 − θ2 = √(πf/α) · L, where Δθ is the phase delay difference, f is the pulse frequency, α is the thermal diffusivity, and L is the distance between point 1 (with phase delay θ1) and point 2 (with phase delay θ2).
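For reference, here is a small sketch that evaluates this phase-delay formula and cross-checks it by recovering the phase difference from two synthetic probe signals via the FFT. The frequency, diffusivity, and probe spacing below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative values only (assumed, not taken from the paper):
f = 1.0e3        # pulse frequency, Hz
alpha = 1.2e-5   # thermal diffusivity of sapphire, m^2/s (typical literature value)
L = 10e-6        # distance between two point probes, m

# Predicted phase delay difference: dtheta = sqrt(pi*f/alpha) * L
dtheta = np.sqrt(np.pi * f / alpha) * L

# Cross-check: synthesize the two probe signals and recover the phase
# difference from the FFT bin at the drive frequency.
fs = 100 * f                       # sampling rate, 100 samples per period
t = np.arange(0, 50 / f, 1 / fs)   # exactly 50 periods (no spectral leakage)
s1 = np.cos(2 * np.pi * f * t)             # signal at probe 1
s2 = np.cos(2 * np.pi * f * t - dtheta)    # signal at probe 2, phase-delayed
k = int(round(f * len(t) / fs))            # FFT bin index of the drive frequency
S1, S2 = np.fft.rfft(s1), np.fft.rfft(s2)
recovered = np.angle(S1[k]) - np.angle(S2[k])

print(f"predicted dtheta = {dtheta:.4f} rad, recovered = {recovered:.4f} rad")
```

The same FFT-phase approach can be applied to the point-probe temperature histories exported from COMSOL, which makes the comparison against the √(πf/α)·L prediction direct.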
My problem is that the simulation results are not consistent with the experimental and theoretical results when I compare the thermal waves detected by point probes 1-5, as shown in the attached .mph file. So my questions are:
Is it appropriate to build such a model in COMSOL (micro- to submicron scale, monitoring thermal waves to compare phases)? I suspect this might not be a task people frequently do with COMSOL. Please kindly point out where I am going wrong (which I suppose is very possible, because I have little experience with COMSOL), or share any ideas to solve this problem. All of your ideas and comments are highly appreciated. Thank you!
Attachments: