Re: [MiNT] www-browser engines (was: This must be an gcc / ld error!)
On Sat, May 29, 2010 at 8:47 PM, Eero Tamminen <oak@helsinkinet.fi> wrote:
> Hi,
>
> On Saturday 29 May 2010, Paul Wratt wrote:
>> I have investigated Origyn. It is a really good option, especially now
>> that WebKit is going through size and speed enhancements, both of which
>> will increase its usability on low-spec machines.
>>
>> For the future of Atari web browsing, WebKit holds the most promise.
>> FF is about to get a "kick in the pants" in the speed and size
>> department, but it is not known for its small footprint, or for its
>> usability on mobile devices.
>
> You cannot compare FF and WebKit. WebKit is just an HTML rendering engine,
> whereas FF is a complete browser: one that is implemented in JavaScript
> using XUL on top of the native Gecko rendering engine.
>
The point was that there are not many (if any) browsers based on Gecko
written for mobile devices. The Gecko engine used in FF2 may have been
a possibility for Atari, but not the engine used in 3+. The Gecko engine
itself will get a speed and size overhaul as a result of the FF4
reworking, that much is inevitable, but will it be enough for it to be
usable? HTML5 adds considerable weight, along with CSS3; we really are
getting towards the end of the practical life cycle of portable apps
(on Atari platforms), even with the speed of the ColdFire CPU (unless a
pin-compatible V5e appears).
> Native UIs built on top of Gecko were e.g. Gnome Epiphany and the Maemo
> Browser.
>
>
> I'm not too convinced that there's such a huge difference between
> Gecko and WebKit speed. Sure, the WebKit JIT looks really good on synthetic
> benchmarks and is faster than Gecko's, but I doubt the difference is
> that large on real www-pages. And you need to take into account that a JIT
> can take a lot of additional memory[1], and if you don't have enough
> memory (PCs have GBs of RAM), things are going to be _a lot_ slower.
>
Good point. Even rendering HTML5 can take up a huge amount of RAM,
considering all the extra fluff it supports.
However, there are techniques that can (and possibly will have to) be
employed to allow rendering of these pages. Without getting into the
full description: as long as "we" have access to good storage and
memory sizes, implementing those techniques (viewports using partial
renders, as opposed to complete renders) is still practical (although
it may not be possible on standard 512K machines).
>
> - Eero
>
> [1] Btw, there are www-pages out there whose content alone can take
> hundreds of MBs of RAM due to the huge pictures they have (as background or
> scaled down with image tag width/height), and on some pages the JS itself
> can also use a huge amount of memory. Happily we don't have Flash. :-)
>
Well, I for one know Flash 9 to be functionally usable on a 200MHz
machine, which is why I'm willing to investigate its potential. That's
not the same as it running in a web page, which nowadays is mostly a
waste of time (ads).
As I have said before (Christmas or so), I have some ideas about JS
engines too. I know for a fact that a 1.3 engine is well within the
limits of most of our hardware, but that is really old now. 1.9+ is
almost as bloated as most browsers themselves.
However, if this current round of "can we get a usable browser ported"
comes to a big fat zero, I have an idea, two actually, and combined
they would give a usable browser: an HTML5/XML renderer and a JS
compiler or precompiler. I believe it is possible for these to
function well even on a bog-standard ST.
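As a rough illustration of the precompiler half of that idea (plain C, with
opcodes and names invented for the example, not taken from SpiderMonkey or
any real JS engine): if the JS were parsed and compiled to a compact
bytecode off-line, the target machine would only need a small stack
interpreter along these lines.

    /* Hypothetical sketch: a tiny bytecode interpreter of the kind a
     * "precompiled JS" target would need; no parser or JIT on the machine. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

    static void run(const int *code)
    {
        int stack[64];
        int sp = 0;                 /* stack pointer */
        int pc = 0;                 /* program counter */

        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++]; break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]); break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void)
    {
        /* Would be produced off-line from something like "print((2 + 3) * 4)". */
        int program[] = {
            OP_PUSH, 2,
            OP_PUSH, 3,
            OP_ADD,
            OP_PUSH, 4,
            OP_MUL,
            OP_PRINT,
            OP_HALT
        };
        run(program);
        return 0;
    }

The interesting part is what is not there: no parser and no JIT on the
target, just a switch loop and a small stack, which is the sort of footprint
that could plausibly fit on an ST.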
However, that will obviously need to be written from scratch, and
before that happens there is some more investigating to do into the
hardware's potential, and a look at current JS engine possibilities.
I need a usable HTML5/XML renderer and a JS compiler or precompiler
for another project, so it may still happen anyway... we will see.
Cheers
Paul