Search found 30 matches

by rotendd
February 22nd, 2017, 5:04 pm
Forum: Dispersion Model
Topic: Looping a Complex Emission
Replies: 5
Views: 1851

Re: Looping a Complex Emission

Thanks! Is there any documentation that explains how hycs_std.exe incorporates both the EMITIMES file and the CONTROL file, versus when it is run with only the CONTROL file?
by rotendd
February 18th, 2017, 4:35 pm
Forum: Dispersion Model
Topic: Looping a Complex Emission
Replies: 5
Views: 1851

Looping a Complex Emission

First of all, thank you for all of your continued help! I have two (hopefully quick) questions: 1.) In the working directory, I have a CONTROL file, EMITIMES file, and an ASCDATA.CFG data file. I am using an R script to loop through every day of the year 2012 (by rebuilding the CONTROL and EMITIMES ...
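The per-day loop described above (rebuilding CONTROL for each day of 2012 before each run) can be sketched as follows. The poster uses R; this is a Python dry-run of the same idea, and the `{start}` placeholder convention, the template contents, and the commented-out hycs_std call are all assumptions, not something the thread confirms.

```python
from datetime import date, timedelta

def hysplit_start_times(year):
    """Return CONTROL line-1 start times ('YY MM DD HH') for every
    day of the given year, starting each run at 00 UTC."""
    d = date(year, 1, 1)
    out = []
    while d.year == year:
        out.append(d.strftime("%y %m %d") + " 00")
        d += timedelta(days=1)
    return out

def rebuild_control(template, start):
    """Substitute the start time into a CONTROL template whose first
    line holds a {start} placeholder (a made-up convention here)."""
    return template.format(start=start)

starts = hysplit_start_times(2012)  # 366 entries (2012 is a leap year)
control = rebuild_control("{start}\n1\n40.0 -90.0 10.0\n24\n", starts[0])
# The real loop would then, for each start time s:
#   write rebuild_control(template_text, s) to CONTROL (and likewise
#   rebuild EMITIMES), then invoke ./hycs_std via subprocess.run
```

The same pattern extends to EMITIMES: keep one template per file, substitute the dates, write both, then run the model once per day.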
by rotendd
February 18th, 2017, 1:01 pm
Forum: Dispersion Model
Topic: Incorporating Exhaust Velocity
Replies: 2
Views: 1748

Re: Incorporating Exhaust Velocity

Thanks again!
by rotendd
February 14th, 2017, 9:17 pm
Forum: Dispersion Model
Topic: "Daily" Runs (under special runs) Only Works for One Day
Replies: 2
Views: 1630

Re: "Daily" Runs (under special runs) Only Works for One Day

Thank you very much, ariel.stein! Sorry for the delayed response (I do not get any notifications of replies; I'm not sure if that's normal for this site). I have since written a script in R that will allow me to loop HYSPLIT and change parameters in the CONTROL file with a bit more flexibility than ...
by rotendd
February 14th, 2017, 9:11 pm
Forum: Dispersion Model
Topic: Incorporating Exhaust Velocity
Replies: 2
Views: 1748

Incorporating Exhaust Velocity

I am looking to model the CO2 emissions of large industrial complexes as accurately as possible. Is there a way to include the initial vertical velocity of the particles as they are exiting the stack? Most of the inputs that I have seen are for emission rate (mass/hr). Is there a way to include a ve...
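The CONTROL file takes a release height rather than an exit velocity, so one common workaround (an assumption here, not something this thread confirms as a HYSPLIT feature) is to fold stack exit velocity into an effective release height using the Briggs plume-rise formulas as given in the EPA ISC3/SCREEN3 model descriptions. A minimal sketch:

```python
G = 9.81  # gravitational acceleration, m/s^2

def buoyancy_flux(vs, ds, ts, ta):
    """Briggs buoyancy flux F (m^4/s^3): vs = stack exit velocity (m/s),
    ds = stack exit diameter (m), ts/ta = stack/ambient temperature (K)."""
    return G * vs * (ds / 2.0) ** 2 * (ts - ta) / ts

def buoyant_rise(f, u):
    """Final buoyant plume rise (m) for unstable/neutral conditions at
    wind speed u (m/s); constants from the ISC3 model description."""
    if f < 55.0:
        return 21.425 * f ** 0.75 / u
    return 38.71 * f ** 0.6 / u

# Illustrative numbers (hypothetical stack, not from the thread):
f = buoyancy_flux(vs=20.0, ds=2.0, ts=400.0, ta=290.0)
dh = buoyant_rise(f, u=5.0)
effective_height = 50.0 + dh  # physical stack height + plume rise
```

The `effective_height` would then go into the CONTROL file as the release height in place of the bare stack height; verify the constants and stability regime against the plume-rise reference before relying on the numbers.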
by rotendd
January 18th, 2017, 5:28 pm
Forum: Dispersion Model
Topic: "Daily" Runs (under special runs) Only Works for One Day
Replies: 2
Views: 1630

"Daily" Runs (under special runs) Only Works for One Day

I am trying to run 24hr dispersions from a point source every day for a month. (Eventually, all 12 months will be used.) This option produces 30 or 31 output files as expected, but only the first day of each month is usable. The rest will not open as PS files and throw the error that I have inc...
by rotendd
January 16th, 2017, 1:10 pm
Forum: Dispersion Model
Topic: Output Plot of Daily Runs
Replies: 1
Views: 1426

Output Plot of Daily Runs

I am using the "daily" option to run 24hr dispersions for January, February, and March of 2012. I then merge these three months together. When I plot the merged file from these three months, the plot clearly sums them, but the title states "integrated from 0000 Marc...
by rotendd
September 28th, 2016, 6:07 pm
Forum: Users
Topic: Averaging HYSPLIT Outputs
Replies: 7
Views: 1932

Re: Averaging HYSPLIT Outputs

Excellent! But is there a way to automate the collection of the individual files? For example, is there a batch file that I can use to just click start once and have HYSPLIT output data for 48hr periods? I would like to add on to this previous question. Is there a way to loop HYSPLIT over 48hr peri...
by rotendd
September 27th, 2016, 2:06 am
Forum: Users
Topic: Averaging HYSPLIT Outputs
Replies: 7
Views: 1932

Re: Averaging HYSPLIT Outputs

Excellent! But is there a way to automate the collection of the individual files? For example, is there a batch file that I can use to just click start once and have HYSPLIT output data for 48hr periods?

Thanks again!
by rotendd
September 25th, 2016, 12:21 pm
Forum: Users
Topic: Averaging HYSPLIT Outputs
Replies: 7
Views: 1932

Averaging HYSPLIT Outputs

I am looking to run HYSPLIT for an entire month but looped in 48hr periods. For example: Feb 1st - Feb 2nd, Feb 3rd - Feb 4th, Feb 5th - Feb 6th, etc., and then average them into one output file. Any assistance would be greatly appreciated!
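The "click start once" automation asked for above can be sketched as a driver that generates one 48hr window per run and renames each cdump file. This is a dry-run Python sketch, not a confirmed HYSPLIT workflow: the model invocation is commented out, the file names are assumptions, and the suggestion of a utility such as concadd for combining the binary outputs should be checked against the utilities shipped with your HYSPLIT installation.

```python
from datetime import date, timedelta

def periods_48h(year, month):
    """Yield (start_time, cdump_name) pairs covering the month in
    48hr steps, e.g. Feb 1-2, Feb 3-4, ... for the example above."""
    d = date(year, month, 1)
    while d.month == month:
        yield d.strftime("%y %m %d 00"), f"cdump_{d:%Y%m%d}"
        d += timedelta(days=2)

runs = list(periods_48h(2012, 2))
# The real loop would, for each (start, name):
#   write a CONTROL file with `start` and a 48hr run duration,
#   run ./hycs_std via subprocess.run, then rename cdump to `name`.
# The per-period cdump files could afterwards be combined/averaged
# with a HYSPLIT concentration utility (e.g. concadd -- verify).
```

2012 is a leap year, so February yields 15 windows; the last window (starting Feb 29) spills into March, which is worth deciding about explicitly before averaging.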