Can't get external folder to be read

Sooo sorry for the caps, but I have been trying to get this to work for a week. I have a standard site folder and a JSON folder called data-new-data (yes, normally it just sits next to css and js in the root dir). It holds JSON files that I add to, and the JS I made pulls the data from the folder with things like scandir. I can't get this to work at all. Is there even a way to get the compiled app to still see the external dir?

Could you send us a sample that reproduces your problem, so we can see what you are trying to do? From words alone, it is always difficult to identify what is happening.

So basically I have a standard website: a PHP index file, some JS files, CSS… and a JSON folder full of JSON files that I update every week. I have used the software to try to compile the website into an EXE, but I need the compiled result to read an external JSON folder so I can update that folder every now and again. It doesn't need to be real time, just on init. Can anyone explain how to do this? I am totally stuck.

Cheers,
Greg.

What I can't understand is that even before the application starts, I get an error at line 88 of my index.php file, which is where I have this code…

<?php
$data = "./data-new-data";
$dir = scandir($data);

$users = [];

// $i starts at 2 to skip the dir entries '.' and '..'
for ($i = 2, $size = count($dir); $i < $size; ++$i) {
    $users[] = explode("-", $dir[$i])[0];
    // $split = explode("-", $dir[$i]);
    // $users[$split[0]][] = $split[1];
}

/* foreach ($users as $key => $value) {
    $users[$key] = array_unique($value);
} */

foreach (array_unique($users) as $user) {
    echo '<li><a class="submenu2" href="graph-1.php?user=' . $user . '">',
        '<i class="icon-bar-chart"></i><span class="hidden-tablet">' . $user . '</span></a></li>';
}
?>
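(Side note for anyone hitting the same error: scandir() returns false when the path cannot be resolved, and everything after it then falls over. Here is a minimal sketch of the same loop with a guard, assuming the same data-new-data layout; array_diff() also drops '.' and '..' without relying on them sitting at indexes 0 and 1:)

<?php
$data = "./data-new-data";

// scandir() returns false (and raises a warning) when the
// directory cannot be found, e.g. inside a compiled/virtual tree.
$dir = @scandir($data);
if ($dir === false) {
    die("Could not read directory: " . $data);
}

$users = [];

// array_diff() removes '.' and '..' regardless of their position.
foreach (array_diff($dir, ['.', '..']) as $entry) {
    $users[] = explode("-", $entry)[0];
}

foreach (array_unique($users) as $user) {
    echo '<li><a class="submenu2" href="graph-1.php?user=' . $user . '">',
        '<i class="icon-bar-chart"></i><span class="hidden-tablet">' . $user . '</span></a></li>';
}
?>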

So we got it working in the end, but it turns out that Chromium is rubbish at processing the quantity of JSON data we have… we need Gecko, and fast.

We are getting crashes with Chromium but not with Firefox when loading large JSON files. Is there any way we can use a Gecko engine?

Strange that Chromium is the culprit. It is more likely to be our virtual engine, which scans for virtual files before looking for real files. By large JSON files, what do you mean? Do you have a lot of JSON files, or files large in size?

Hi!

Thanks for the response.

As it happens, I finally managed to fix the problem. It was all down to the syntax of the path passed to scandir, taking into account the fact that the source is virtual; in the end it was …/user-data/ that worked. So it does access the external JSON folder, which holds about 100 MB of files… it barely copes. I wonder if there are specific command-line arguments to make Chromium massively faster; it only uses the 2D canvas and JSON, that's it!
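(For anyone else who lands here, the underlying idea is to avoid hard-coding a single relative path. A minimal sketch, assuming a hypothetical external folder named user-data, that probes a few candidate locations and uses the first real directory it finds:)

<?php
// Candidate locations for the external data folder.
// "user-data" is just an example name; adjust to your setup.
$candidates = [
    __DIR__ . '/user-data',   // next to this script
    getcwd() . '/user-data',  // current working directory
    '../user-data',           // relative path, similar to what worked above
];

$dataDir = null;
foreach ($candidates as $candidate) {
    if (is_dir($candidate)) {
        $dataDir = $candidate;
        break;
    }
}

if ($dataDir === null) {
    die('External data folder not found.');
}

// From here on, scan the resolved directory as usual.
$files = array_diff(scandir($dataDir), ['.', '..']);
?>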

Any discounts if I buy today, btw?

Great that it works for you now, except for the speed of Chromium…
Please check your PM too :wink:
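(A general way to lighten the load on the embedded browser, whichever engine is used: do the heavy JSON work in PHP and hand the page one small, pre-aggregated payload instead of many large files. A sketch, assuming the files are plain JSON arrays of rows; the folder name and the x/y fields are purely illustrative:)

<?php
// Merge every JSON file in the data folder server-side so the
// browser only has to parse one compact document.
$dataDir = './user-data'; // illustrative path
$merged  = [];

foreach (glob($dataDir . '/*.json') ?: [] as $file) {
    $rows = json_decode(file_get_contents($file), true);
    if (is_array($rows)) {
        // Keep only the fields the canvas actually draws.
        foreach ($rows as $row) {
            $merged[] = [
                'x' => $row['x'] ?? null,
                'y' => $row['y'] ?? null,
            ];
        }
    }
}

// One small payload for the page instead of 100 MB of raw files.
echo '<script>var chartData = ' . json_encode($merged) . ';</script>';
?>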

I checked my PM and realised my mail posts had been reorganised, which was really helpful. Thanks for that.

Are you planning to release a new version shortly, and are there any discount vouchers available? I just want to make sure I am buying at the “right time”, as it is quite expensive for a halfling such as myself :blush:

Best Regards,

Greg.

The new version should come within this month or the next, depending on whether we add new features or not. We have a lot of user requests (which is great of course), so it is also taking time to implement some requests.