> I am not talking about the copies of a picture that you store for
> yourself and the future.
And I did not make that clear enough: I am talking about both, storing
for myself and others, and publishing on the internet. What I tried to
make clear is that the download speed depends on how the software on the
server side presents the picture. If that server-side software reduces
the definition depending on the situation, that is, first presents a
low-definition thumbnail, then a higher resolution, and only if really
needed the full-definition original, then both slow connections and the
need for lossless quality are served. Long, complicated story, but easy
to see if you follow the given link and try it for yourself. Maybe the
demo will convince you.
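For what it's worth, here is a minimal sketch in Python (using the
Pillow library) of the kind of server-side behaviour I mean. The gallery
runs its own software, so the size names, limits and paths below are
only assumptions for illustration:

  # Keep the full-definition original; create smaller copies on demand.
  from pathlib import Path
  from PIL import Image

  SIZES = {"thumb": 200, "screen": 1024}   # longest edge in pixels (assumed)

  def get_image(original: Path, variant: str) -> Path:
      """Return the requested variant, generating it from the original if needed."""
      if variant == "full":
          return original                  # full definition, only on explicit request
      cached = original.with_name(f"{original.stem}_{variant}{original.suffix}")
      if not cached.exists():
          with Image.open(original) as img:
              img.thumbnail((SIZES[variant], SIZES[variant]))  # keeps aspect ratio
              img.save(cached)
      return cached

The full-definition file stays untouched on the server and is only sent
when a visitor explicitly asks for it; everybody else gets the small
copy first.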
>
> I am talking about the images available on the internet for download. Any
> file bigger than about 150k will not display any better on a screen than
> those at 150k. (Of course they will print better)
Not necessarily true; in my case, on a 22" photo-editing monitor it
really shows: 2048x1600 is a lot more than 150k can hold without visible
loss.
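To put a number on it: at 2048x1600 and 24 bits per pixel an
uncompressed picture is 2048 x 1600 x 3 bytes, roughly 9.8 MB, so a 150k
JPEG means compression of about 65:1, and on a good monitor that is
visible.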
>
> The problem here is the time to download. If I started to look at a web
> page and found it had 20 pictures and the first was 3 MB and took 6 or
> more minutes to download I would not be bothering to look at ANY others.
Of course not. I agree that too many web designers assume that everybody
has a high-speed connection nowadays. Bad design, bad software.
The essence of the story, however, remains: please store on a reliable,
backed-up medium like a professional server, do not reduce the
definition yourself, and let the server software do that when and if
needed. If it's Europa-related stuff, it is free of charge, and there
are lots of grateful builders:
http://forum.okhuijsen.org -> gallery
Jos Okhuijsen