
Windows 7

ToeClaws

PEBKAC exterminator
Hoping with Windows 7 they bring that functionality back. And maybe release a single version of the OS? This whole Home, Home Premium/Business, Ultimate thing is just ridiculous. Hey, let's write Vista Ultimate, and then CRIPPLE it, and sell those as cheaper versions! Yay! I'm glad I didn't pay much for my copy...

Unfortunately no... I just read on Slashdot the other day that they're going to release multiple versions again. >_<

reian: Good move. :) Why consider going to Windows 7 if you've got Ubuntu? Games (my #1 reason for a working copy of Windows, heh)?
 

net-cat

Infernal Kitty
i run linux, i am secure *has computer compromised by a tarball*
Okay, I need you to open up a shell and type this...

Unfortunately no... I just read on Slashdot the other day that they're going to release multiple versions again. >_<
Of course they are.

Why sell one OS to everyone for $150 when you can sell the same thing to some people for $400?
 

ToeClaws

PEBKAC exterminator
Of course they are.

Why sell one OS to everyone for $150 when you can sell the same thing to some people for $400?

Aye - of course, my counter thought is "why pay ANYTHING for a Windows OS when you can have a vastly superior one for free?" :mrgreen:
 

net-cat

Infernal Kitty
Aye - of course, my counter thought is "why pay ANYTHING for a Windows OS when you can have a vastly superior one for free?" :mrgreen:
Non sequitur. People are willing to pay money for Windows whether you understand it or not. :p
 

Irreverent

Member
Why sell one OS to everyone for $150 when you can sell the same thing to some people for $400?

Three-tiered marketing (consumer, SOHO, enterprise)... after all, marketing is the world's second-oldest profession.
 

Wild_Wolf

New Member
Windows 7 is awesome. I have the beta installed on one of my computers and I love it so far; I haven't had any issues with it.
 

AlexInsane

I does what I says on the box.
Aye - of course, my counter thought is "why pay ANYTHING for a Windows OS when you can have a vastly superior one for free?" :mrgreen:

Because you have to WORK at understanding alternative OS's, unlike Windows, which practically wipes the user's ass for them.

tl;dr: Windows is for lazy bums who can't be arsed to care.
 

ToeClaws

PEBKAC exterminator
Because you have to WORK at understanding alternative OS's, unlike Windows, which practically wipes the user's ass for them.

tl;dr: Windows is for lazy bums who can't be arsed to care.

Heh, which I've felt is a problem all along - the easier the OS gets and the more it does automatically, the less the user ends up learning.

Still, that said, if I were to compare the complexities of installing an OS, updating it, and installing applications - say, Windows XP versus Ubuntu Linux - Ubuntu is way, WAY easier in every category.

I think the biggest obstacle is "change" - people are used to Windows. We've been cursed with having it on our PCs for nearly 20 years, so we're talking entire generations that have grown up with it and know nothing else. To change to a completely different platform is very intimidating and people tend to avoid it.

Interesting thing though is that it's getting that way anyway. Windows XP tried to mash up the Explorer shell a little, which most people disliked (you can set it to "classic" mode to go back to the older 98/2000-style interface). Vista really changed up the interface, and I imagine Windows 7 does as well. In staying with Windows, people will have to accept change anyway, so much of the built-up apprehension about trying something different is unfounded.

Besides, the beautiful thing about a lot of the other OS's out there is that they come on LiveCDs, so you can boot up the CD and actually see, use and play around with the OS without even installing it on your PC, and that way know whether it supports everything and is something you actually want to go ahead with.

There are reasons though that Windows will stay on some people's systems, and there always will be. Games are the big one. Until games are made for other platforms (which is unlikely unless they become a larger market share), Windows is really the only choice for games. And there are also other applications like Photoshop or Visio which are not made for all the other OS's. And then there's just user preference - some people just like it, even having tried other stuff. I mean... some people liked the Chevy Vega, even though everyone knew it was a crappy car. :rolleyes: And some systems are so proprietary that nothing else works properly on them or supports them.

I have Windows on two of my machines - my laptop, because it falls into that mega-proprietary category (it can't even run Windows 2000 or 2003, even though they're based on the same kernel as XP), and my main PC, because its job is games.
 

net-cat

Infernal Kitty
When it comes to 64 bit, Vista does have some advantages and this is because it does 64 bit and all backwards compatible 32 bit correctly. Windows XP had 64 bit added to it as more of an afterthought, and it uses emulation to handle different backwards compatibility and functions (and that doesn't work very well).
Hmm. I was skimming the thread again and I just noticed this. This is false. Windows XP x64 Edition, like Windows Vista x64, is actually derived from Windows Server 2003 x64 Edition. This is evident in the kernel version (5.2.3790) and in the fact that it follows the Server 2003 update schedule. (Including service packs. Server 2003 and XP x64 are only up to SP2.) All the AMD64 versions of Windows (Server 2003 x64, XP x64, Vista x64, Server 2008 x64) use the same processor features to run 32-bit code at the same speed. They also all share the fact that there's no way to run 16-bit code natively. (DOSBox or Virtual PC is required.) And none of them can use 32-bit device drivers.
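
For the curious, here's a minimal sketch (just an illustration, assuming a 32-bit Python interpreter on 64-bit Windows) that asks the kernel whether the current process is running under WOW64, the compatibility layer those x64 editions use to run 32-bit code:

Code:
# Minimal illustration: ask Windows whether this process runs under the
# WOW64 layer (the 32-bit-on-64-bit compatibility subsystem).
# IsWow64Process is a real Win32 API, but it isn't exported on very old
# systems, hence the fallback.
import ctypes

def running_under_wow64():
    kernel32 = ctypes.windll.kernel32
    try:
        is_wow64 = ctypes.c_int(0)
        ok = kernel32.IsWow64Process(kernel32.GetCurrentProcess(),
                                     ctypes.byref(is_wow64))
        return bool(ok) and bool(is_wow64.value)
    except AttributeError:
        # Older Windows without IsWow64Process: no WOW64 layer available.
        return False

if __name__ == "__main__":
    print("Running under WOW64:", running_under_wow64())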

There was, however, a short-lived Itanium port of XP, derived from the Itanium port of Server 2003 and never released outside of a limited beta, which did use software emulation for 32-bit code, mostly because the Itanium was not in any way compatible with x86. (That was its downfall, really. It was marketed as a replacement for x86, but wasn't compatible with it. That's why the AMD64 extensions were accepted in the market and Itanium wasn't. I remember back when Intel was still clinging to the hope that the Itanium might become a consumer product, they were downplaying AMD64 by calling it things like "Extended Memory 64 Technology.")

There are reasons though that Windows will stay on some people's systems, and there always will be. Games are the big one. Until games are made for other platforms (which is unlikely unless they become a larger market share), Windows is really the only choice for games. And there are also other applications like Photoshop or Visio which are not made for all the other OS's.
Indeed. That's why Windows won out back in the day. Their platform is "open." Anyone can write code that runs on Windows. The Platform SDK is now a free download, and has always been available for relatively cheap.

And it's kind of also what shot them in the foot. Anyone with an internet connection can get the Platform SDK and write crap code for the OS.

... Apple is going to find themselves going down this road since Xcode is also free and they're getting popular. Linux will experience the same if they get popular, too.

I have Windows on two of my machines - my laptop, because it falls into that mega-proprietary category (it can't even run Windows 2000 or 2003, even though they're based on the same kernel as XP), and my main PC, because its job is games.
I have XP on my laptop. Mostly because too many things about Linux outright don't work with it. (Standby, Hibernate, Tablet, Screen rotation, Power Management...) I managed to hack an Xorg driver to make the tablet usable in a minimal sense, but the rest of that stuff is rather critical to the operation of a laptop, and I really don't want to spend days poking kernel config variables or building custom kernel after custom kernel because "some guy on the internet" said "it might work if you do this."
 

Irreverent

Member
I think the biggest obstacle is "change" - people are used to Windows. We've been cursed with having it on our PCs for nearly 20 years, so we're talking entire generations that have grown up with it and know nothing else. To change to a completely different platform is very intimidating and people tend to avoid it.

Resistance to change may be part of it, but for the consumer/SOHO market, I suspect bundling is a big part too. Windows is pre-loaded on 90% of the boxes out there. I think Dell also dropped their Linux preload images on residential boxes. Did Gateway's ever get off the ground? And really, what Linux nerd would use a pre-loaded Linux image anyway? :D

Fewer customer service/tech support functions have to be insourced back to NA when basic Windows support can be outsourced offshore, which makes Windows the Model T of OSes... any colour as long as it's black.
 

ToeClaws

PEBKAC exterminator
Hmm. I was skimming the thread again and I just noticed this. This is false. Windows XP x64 Edition, like Windows Vista x64, is actually derived from Windows Server 2003 x64 Edition. This is evident in the kernel version (5.2.3790) and in the fact that it follows the Server 2003 update schedule. (Including service packs. Server 2003 and XP x64 are only up to SP2.) All the AMD64 versions of Windows (Server 2003 x64, XP x64, Vista x64, Server 2008 x64) use the same processor features to run 32-bit code at the same speed. They also all share the fact that there's no way to run 16-bit code natively. (DOSBox or Virtual PC is required.) And none of them can use 32-bit device drivers.

There was, however, a short-lived Itanium port of XP, derived from the Itanium port of Server 2003 and never released outside of a limited beta, which did use software emulation for 32-bit code, mostly because the Itanium was not in any way compatible with x86. (That was its downfall, really. It was marketed as a replacement for x86, but wasn't compatible with it. That's why the AMD64 extensions were accepted in the market and Itanium wasn't. I remember back when Intel was still clinging to the hope that the Itanium might become a consumer product, they were downplaying AMD64 by calling it things like "Extended Memory 64 Technology.")

*bows* I stand corrected. I hadn't read up much on the 64-bit versions, and I must have read about that prototype and nothing more. I do know that the 64-bit capabilities of Vista were a lot more polished than in XP x64, but then, that's often the case. Windows 95 was 32-bit, but the "A" version was a FAR cry from efficient or functional. And wow... the Itanium... that was a horribly failed processor. Not that the idea was bad, but the first iteration of it certainly was.

Indeed. That's why Windows won out back in the day. Their platform is "open." Anyone can write code that runs on Windows. The Platform SDK is now a free download, and has always been available for relatively cheap.

And it's kind of also what shot them in the foot. Anyone with an internet connection can get the Platform SDK and write crap code for the OS.

... Apple is going to find themselves going down this road since Xcode is also free and they're getting popular. Linux will experience the same if they get popular, too.

Oi, crap code, totally. When I originally went for my degree, I wanted to enter programming, but when I got exposed to having to use the Windows libraries... good gods... what a mess. The code was brutally inefficient and very redundant. Actually turned me off to programming and I never went back.

I have XP on my laptop. Mostly because too many things about Linux outright don't work with it. (Standby, Hibernate, Tablet, Screen rotation, Power Management...) I managed to hack an Xorg driver to make the tablet usable in a minimal sense, but the rest of that stuff is rather critical to the operation of a laptop, and I really don't want to spend days poking kernel config variables or building custom kernel after custom kernel because "some guy on the internet" said "it might work if you do this."

*nodsnods* Yep, same batch of fun I had on mine. I was able to get the ACPI stuff to work with some edits to the ACPI config and some modules config tweaking. In the end, the one thing that I could never make work correctly was playing DVDs, which I do constantly on my laptop - there was a problem with the nVidia driver for the older 440 Go GPU and it's never been fixed. The screen updated at a different rate at the top than at the bottom, which made it unwatchable. Unless someone fixes the driver, there's no way around that one. :/ Will likely get my next laptop from System76 to avoid the proprietary OS curse.
 

net-cat

Infernal Kitty
*bows* I stand corrected. I hadn't read up much on the 64-bit versions, and I must have read about that prototype and nothing more. I do know that the 64-bit capabilities of Vista were a lot more polished than in XP x64, but then, that's often the case. Windows 95 was 32-bit, but the "A" version was a FAR cry from efficient or functional. And wow... the Itanium... that was a horribly failed processor. Not that the idea was bad, but the first iteration of it certainly was.
Oh, yes. There are many things. Like the distinction between IE 64-bit and IE 32-bit. That existed and was quite prevalent in XP x64 and Server 2003 x64. In Vista x64, they assume you want the 32-bit IE, as few extensions have been ported to 64-bit. (Flash, for example.) The 64-bit version is still there should you care to go looking for it.

The transition from 32-bit to 64-bit generally went a lot smoother than the transition from 16-bit to 32-bit, mostly because it was done on NT rather than as a hack on DOS. NT has always been quite portable. Back when it was new, it supported half a dozen different architectures.

And they made a clean break for some features. We all remember DOS-mode drivers from Win9x. Microsoft decided that there would be none of that shit in 64-bit. If you want a 64-bit OS, you need 64-bit drivers.

*nodsnods* Yep, same batch of fun I had on mine. I was able to get the ACPI stuff to work with some edits to the ACPI config and some modules config tweaking. In the end, the one thing that I could never make work correctly was playing DVDs, which I do constantly on my laptop - there was a problem with the nVidia driver for the older 440 Go GPU and it's never been fixed. The screen updated at a different rate at the top than at the bottom, which made it unwatchable. Unless someone fixes the driver, there's no way around that one. :/ Will likely get my next laptop from System76 to avoid the proprietary OS curse.
See, that's part of the problem I've encountered with FOSS in general. If the developer doesn't have the problem, then the problem doesn't exist and will never, ever get patched.

*cough*Interlaced PNG in the Python Imaging Library*cough*
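
(For anyone who hits the same wall: one possible workaround - just a sketch, assuming ImageMagick is installed - is to strip the interlacing before PIL ever sees the file.)

Code:
# Hypothetical workaround sketch: re-encode an Adam7-interlaced PNG as a
# non-interlaced one with ImageMagick, then open it with PIL as usual.
# "-interlace none" is a standard ImageMagick option; the temp filename
# is just an example.
import subprocess
from PIL import Image

def open_png_deinterlaced(path, tmp_path="deinterlaced.png"):
    subprocess.check_call(["convert", path, "-interlace", "none", tmp_path])
    return Image.open(tmp_path)

img = open_png_deinterlaced("input.png")
print(img.size, img.mode)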
 

ToeClaws

PEBKAC exterminator
Oh, yes. There are many things. Like the distinction between IE 64-bit and IE 32-bit. That existed and was quite prevalent in XP x64 and Server 2003 x64. In Vista x64, they assume you want the 32-bit IE, as few extensions have been ported to 64-bit. (Flash, for example.) The 64-bit version is still there should you care to go looking for it.

The transition from 32-bit to 64-bit generally went a lot smoother than the transition from 16-bit to 32-bit, mostly because it was done on NT rather than as a hack on DOS. NT has always been quite portable. Back when it was new, it supported half a dozen different architectures.

And they made a clean break for some features. We all remember DOS-mode drivers from Win9x. Microsoft decided that there would be none of that shit in 64-bit. If you want a 64-bit OS, you need 64-bit drivers.

Agreed, and I believe they also got better at optimizing the chips for 64-bit sooner rather than later. 32-bit was, technically speaking, introduced in 1985 with the i80386, but it was nowhere near optimized in the CPUs until the Pentium Pro's P6 architecture a decade later. On top of that, as you said, the 32-bit code was a joke since it was built on the back of a 16-bit OS, which was itself a hack job on what was originally an 8-bit OS. Bad, bad design.

See, that's part of the problem I've encountered with FOSS in general. If the developer doesn't have the problem, then the problem doesn't exist and will never, ever get patched.

*cough*Interlaced PNG in the Python Imaging Library*cough*

*chuckles* Yep - that's right up there with black-box programmers - either way, they don't see past a certain point, so they don't worry about it, and that hurts the end product. What burns me most about it when it comes to stuff like the driver issues is that there is no lack of reports coming in from folks all over the world that the issue exists, but more often than not, the developers just ignore them because they either don't understand there's a problem for others, or just don't care because there wasn't one for them. In some instances, it's also because the problem exists for a group that developers consider to be "diminishing" and too small to be concerned over. Like in my case, the 440 Go GPU is 5 years old, and not many are out there anymore, so "why scramble to support them?"

Oh well... this is the spice that makes the computer field interesting. :p
 

net-cat

Infernal Kitty
Agreed, and I believe they also got better at optimizing the chips for 64-bit sooner rather than later. 32-bit was, technically speaking, introduced in 1985 with the i80386, but it was nowhere near optimized in the CPUs until the Pentium Pro's P6 architecture a decade later. On top of that, as you said, the 32-bit code was a joke since it was built on the back of a 16-bit OS, which was itself a hack job on what was originally an 8-bit OS. Bad, bad design.
Hmm... How does it go? "Windows 95: A thirty-two bit extension and graphical shell to a sixteen-bit patch to an eight-bit operating system originally coded for a four-bit microprocessor which was written by a two-bit company that can't stand one bit of competition."

To be fair, the x86 series started out as a budget microprocessor, which is why they caught on in the first place. Starting with Pentium, they abandoned the roots of the architecture in favor of a more efficient RISC-like core and slapped an instruction translator on it. (Yay for being able to cram more transistors onto a single chip.) x86 persists because the translator takes an utterly minuscule amount of silicon to implement compared to the rest of the chip, and feature size just keeps getting smaller.


*chuckles* Yep - that's right up there with black-box programmers - either way, they don't see past a certain point, so they don't worry about it, and that hurts the end product. What burns me most about it when it comes to stuff like the driver issues is that there is no lack of reports coming in from folks all over the world that the issue exists, but more often than not, the developers just ignore them because they either don't understand there's a problem for others, or just don't care because there wasn't one for them. In some instances, it's also because the problem exists for a group that developers consider to be "diminishing" and too small to be concerned over. Like in my case, the 440 Go GPU is 5 years old, and not many are out there anymore, so "why scramble to support them?"

Oh well... this is the spice that makes the computer field interesting. :p
Heh. True. Like my adventures getting my GeForce 7600 to work in XP x64. It worked until SP2 was released. I (and many, many others) reported the issue to nVidia and were ignored. EVGA's answer was that I shouldn't be using Server 2003 or any operating system derived from it. (Pre-Vista.) Ultimately, I had to buy myself a cheap Radeon card and use that in the meantime. Several driver updates came and went, but the issue was never resolved.
 

ToeClaws

PEBKAC exterminator
Hmm... How does it go? "Windows 95: A thirty-two bit extension and graphical shell to a sixteen-bit patch to an eight-bit operating system originally coded for a four-bit microprocessor which was written by a two-bit company that can't stand one bit of competition."

*laughs* Yeah, I think that was it - totally true though. :) It's like building an inverted pyramid, and the instability showed rather well in the Windows 9x days.

To be fair, the x86 series started out as a budget microprocessor, which is why they caught on in the first place. Starting with Pentium, they abandoned the roots of the architecture in favor of a more efficient RISC-like core and slapped an instruction translator on it. (Yay for being able to cram more transistors onto a single chip.) x86 persists because the translator takes an utterly minuscule amount of silicon to implement compared to the rest of the chip, and feature size just keeps getting smaller.

Aye, and therein was the mistake with the Itanium - they wanted to break from the x86 translator. Though it's a technically sound idea, it also meant breaking the ability of the thing to work with ANYTHING that was not custom-designed for it. x86 will likely be around a long, long time (even though it's not directly used). And yes - the move to the RISC-like core was a sweet one. I remember drooling over Alpha chips for the longest time 'cause they made the (then) current crop of x86 CPUs look so feeble.

Heh. True. Like my adventures getting my GeForce 7600 to work in XP x64. It worked until SP2 was released. I (and many, many others) reported the issue to nVidia and were ignored. EVGA's answer was that I shouldn't be using Server 2003 or any operating system derived from it. (Pre-Vista.) Ultimately, I had to buy myself a cheap Radeon card and use that in the meantime. Several driver updates came and went, but the issue was never resolved.

Ouch. Yeah, that reminds me of a few GPU generations before that, when ATI made the Rage Fury MAXX. The thing was a beast and was easily the most powerful card of its day; only, they had very limited drivers for it. It only worked on Windows 98... that's it. Moreover, you had to visit ATI's site to learn some of the custom tweaking you'd need to do to get it to work fully even in 98. When 2000, XP and so on came along, the card was pretty much abandoned.
 

dietrc70

Active Member
Heh. True. Like my adventures getting my GeForce 7600 to work in XP x64. It worked until SP2 was released. I (and many, many others) reported the issue to nVidia and were ignored. EVGA's answer was that I shouldn't be using Server 2003 or any operating system derived from it. (Pre-Vista.) Ultimately, I had to buy myself a cheap Radeon card and use that in the meantime. Several driver updates came and went, but the issue was never resolved.

I remember that was one of my reasons for being pleased with ATI (and sticking with them to this day). I was an early adopter of XP x64, and they had full driver support very early. XP 64 was a great OS. As you say, it was Server 2003-64 in a workstation package. With heavy use, it proved itself noticeably more robust than XP.

Vista 64 was a mess on release. After about 6 months of updates, it became quite decent, though, and it's what I use now.

I used Linux for business servers in 1999-2004, and saved my employer a fortune. I switched to Server 2003 before leaving because I liked it (it is a fine server OS), and because it was much easier to get support.

I don't use Linux because I use lots of apps (e.g. Acrobat Pro and Photoshop) that have no Linux equivalent. I also like Vista's Aero better than any desktop manager I've seen for Linux. Vista's a good OS at this point.

I've tried out the Windows 7 64-bit beta, and I'm very impressed. It is noticeably faster and leaner, and does not act like a beta at all... it seems more like a late Release Candidate. I think MS did its homework this time. Vista was really not ready at RTM.
 

Biles

Active Member
Because you have to WORK at understanding alternative OS's, unlike Windows, which practically wipes the user's ass for them.

tl;dr: Windows is for lazy bums who can't be arsed to care.

Heh, which I've felt is a problem all along - the easier the OS gets and the more it does automatically, the less the user ends up learning.

Geez, you make it sound like it's a bad thing. I understand it's okay for tech-savvy professionals to tinker around with powerful workstations and rigs, but don't expect an average Joe or a soccer mom to care much about what is under the hood. The home consumer and small businesses are not going to buy a personal computer simply to toy around with; they want a reliable tool to help them become productive in life, be it personal or business.
 

ToeClaws

PEBKAC exterminator
Geez, you make it sound like it's a bad thing. I understand it's okay for tech-savvy professionals to tinker around with powerful workstations and rigs, but don't expect an average Joe or a soccer mom to care much about what is under the hood. The home consumer and small businesses are not going to buy a personal computer simply to toy around with; they want a reliable tool to help them become productive in life, be it personal or business.

I agree - to a point. Ten years ago, for example, Linux-based OS's were clunky and difficult to use. You would never see them as the operating system of a novice user because you had to be really tech-savvy to run them. Same with the BSD world. I remember thinking back then that if Linux and BSD ever wanted to be real competitors to Windows, they had to become easy enough to use and maintain that a novice could do it. Hardcore geeks were quite against that at the time because they felt it would ruin the OS, but they also came to realize that there would never be any way for the OS to become popular if it didn't make that change.

So, fast-forward to today and you have versions of Linux and BSD that are incredibly easy to use - in fact, I would go so far as to say several flavours of them are much easier to install and use than Windows.

But... there is a fine line between making things easier for the users and making things a little too automated. There's no harm in learning a few simple operations of the computer. For example, when you plug in a USB drive, you used to have to open up Windows Explorer (in Windows, anyway) and go to the drive to access it. In XP and later versions, they added a service that pops up a little window showing the contents and basically offers you stuff to run the contents with. To me, that's an example of going too far. The user now has little concept of what the drive he/she is using really is or where it is, or how to find it in the listing of drives and directories. Instead, they have this cute interface that they can click through to use stuff on it. It wasn't hard to do before, and at least before, users learned a bit more about the file and directory structure of their PCs.
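
(To make that concrete, here's a rough, purely illustrative sketch - Windows-only, using the Win32 API through ctypes - of the "manual" view of things: list the drive letters and flag which ones are removable, i.e. what you'd see by just browsing the drive listing yourself.)

Code:
# Rough illustration (Windows-only): enumerate drive letters and mark the
# removable ones, the same information Explorer's drive listing shows.
import ctypes
import string

DRIVE_REMOVABLE = 2  # documented return value of GetDriveType for removable media

kernel32 = ctypes.windll.kernel32
bitmask = kernel32.GetLogicalDrives()  # bit 0 = A:, bit 1 = B:, and so on

for i, letter in enumerate(string.ascii_uppercase):
    if bitmask & (1 << i):
        root = letter + ":\\"
        drive_type = kernel32.GetDriveTypeW(ctypes.c_wchar_p(root))
        flag = " (removable)" if drive_type == DRIVE_REMOVABLE else ""
        print(root + flag)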

So what I like/want is just a happy balance between automation and user understanding of the system. Windows 2000 (in the Windows world) was probably the best example of a version of Windows that was sufficiently automated to make things easy, but manual enough that the user could still learn and understand their computer.
 

RidgeCityFM

New Member
I am SO glad Windows 7 is already on its way. I was afraid I'd have to use Vista when I get a laptop someday, especially after hearing that Vista actually performs worse than XP overall. Then again I never actually heard that from a RELIABLE source.
 

WarMocK

I like to nuke ^^
So what I like/want is just a happy balance between automation and user understanding of the system. Windows 2000 (in the Windows world) was probably the best example of a version of Windows that was sufficiently automated to make things easy, but manual enough that the user could still learn and understand their computer.

QFT :D
Win2k was the best OS Microsoft has released so far.
 

dietrc70

Active Member
QFT :D
Win2k was the best OS Microsoft has released so far.

It is definitely a classic. I was quite anti-Microsoft in the '90s. They consistently released crappy DOS-hack OS's, bullied their OEMs to keep out competitors, and killed better software suites with the horrible (at the time) bundled Office. Windows NT was a horrible joke.

Then Win2k came out, and it didn't suck. I couldn't believe it. I am sure that the threat of the antitrust lawsuit against MS scared them into getting their act together, and they realized they would be toast if they didn't do something to win back the goodwill of their users (like releasing a piece of software that didn't suck).

I have a Virtual PC Win2K on my Vista 64 machine. It's great for running anything old and weird (or 16-bit).
 

Irreverent

Member
To me, that's an example of going too far. The user now has little concept of what the drive he/she is using really is or where it is, or how to find it in the listing of drives and directories. Instead, they have this cute interface that they can click through to use stuff on it. It wasn't hard to do before, and at least before, users learned a bit more about the file and directory structure of their PCs.

I think you're blurring the line between core OS functionality and user interface design. Earlier OS's lacked robust UIs, both as a function of hardware and of time-to-market pressures. As hardware becomes cheaper and faster, the UI should evolve to the point where it becomes ubiquitous and pervasive; a "turn the key and go" mentality. The average car driver is more concerned with programming the HVAC controls than with the PCM and the embedded OS running the engine management system.

The initial moves towards virtualization, application service providers and so-called "cloud computing" would seem to bear this out. The end user should be isolated from the intricacies of OS-to-hardware interfacing... that's for professionals.

This will become doubly important once we free ourselves from the dependency on and limitations of the current crop of graphical user interfaces. Once UIs start to become driven by presence and location awareness, the average user will have little to no use for foundational OS elements (file systems, network stacks, memory allocation et al.).

Or put another way, "Computer, tea, Earl Grey, hot!"

Sadly, there's none of this in Windows 7. Sure, you can add it on as an application, but there's no built-in extensibility to the OS.
 