The simplest improvement suggestion, ever: a new text color!
Hey scavier,
we just need a new color to choose for text: orange. It would be really nice, because the color looks cool and modern, and most notably: I need the color for my nickname
Actually... Add orange, but limit its use in nicks to people from Holland... I mean... We're Dutch, we live in the Netherlands, and we call our country Holland... We deserve a separate color
If orange is added, then you can be sure that 99% of the Dutch teams will change their team tags to be orange, yes
Because it's unreadable in any car that has a roof... This may be a bit harsh, but I literally HATE black letters at the moment (in LFS).
Maybe LFS should have drop shadows? A light one for dark colors, and a dark one for light colors.
The issue: LFS player names are limited to 24 bytes. Insisting on a minimum of 6 bytes of hex per colour is a bad idea for obvious reasons, and the current "format" isn't ideal for a wide variety of colours.
A further problem is that the colours 0-9 are all used up. When you input ^10, how do you know whether you meant ^10 or ^1 followed by 0?
(It's 23 usable characters. Where's the other byte gone?)
Just like you would use brackets on the forum...
^10^
The ^ characters don't need to be in the username; they can be translated to a binary colour code of a single byte. There must be some unused bit sequences for characters, so we can encode the colours in one of those.
1001cccc = 16 colors. [XP]-TagForce- = 14 characters.
^white^ + ^red^ + ^white^ + ^blue^ would make 18 characters...
In the current format my name can have just 1 extra character.
(15 characters + 4 * 2 characters for colors)
NULL character terminator probably, unless I miscounted.
LFS currently uses ASCII effectively, and then uses a codepage to swap between each "language". What this means (and I'll admit that I don't know the exact details of codepages, so I could be wrong) is that there are still only the values 0-255 for each unsigned byte, but the codepage controls whether it's an English or other character (basically it's a cheap way out of Unicode, both in terms of total bytes required and of converting your program to Unicode strings). The upshot is that all 0-255 values are "used" in each codepage, so there's no real spare "binary" we can transfer the colours into.
I've not looked at the LFS transactions recently, between client and server, but if I remember correctly, and if it follows LFS world, the colour codes are sent as ^ followed by the number.
The only current solution I can think of is to still send ^ followed by the number, but make the byte after the ^ hold the number directly, if you see what I'm getting at. The problem with this is that such names aren't manually editable with a text editor; you'd have to have something in between to convert to the correct byte value.
For instance, instead of typing ^9 (which would be 94 57 in decimal if you looked at the character array), you'd enter 94 for the ^ again, and then the next byte would hold the value 9 for the colour. This gives you 255 possible colours (we can't use 0, as that's NUL, and in C that terminates the string). I've tried to explain this, but it's potentially rather difficult without a whiteboard, or being together in person and on the same wavelength, I fear.
Edit:
#include <stdio.h>
#include <string.h>

int main()
{
    // For example my name
    char name[24];
    name[0] = '^';                        // set the hat
    name[1] = 9;                          // the raw colour value, as in the ^9 example above
    strcpy(name + 2, "the_angry_angel");  // copy the rest of my name in (strcpy adds the NUL itself)
    printf("%s\n", name);
    return 0;
}
Now if you compiled this and then ran it, you'd see the following:
^ the_angry_angel
This is because 9 is technically a tab in ASCII, but we don't care about that; all we care about is the true numeric value.
I didn't realize how expensive it would be to have multiple colors. In that case, we should just have a 256-color palette. I'm sure 256 colors is plenty, and even with the username limited to 24 bytes, it would work fine. Frankly, even a 64-color palette would be fine.
Which would make it NUL-terminated... which opens whole new possibilities in itself (but that's beyond the scope of the current discussion). The current name seems to be a fixed-size array of chars (or bytes, whichever), but then, I have no idea how C handles those. In Delphi an array of characters is stored in memory like:
[size] : byte;        // size of the array
[0..size-1] : char;   // the actual characters
Yes... Basically each codepage is an array of character images from 0 to 255... But each codepage (even in DOS) has a lot of useless characters in it. Basically all you'd need is 0-9, a-z, A-Z, and a whole lot of special characters, like letters with accents... We don't need control characters, since we don't control anything on a single line of text: we don't need a CR/LF character (which does exist in an ASCII codepage, and even in Unicode's UTF-8 encoding, for compatibility with... drumroll... ASCII), nor the ESC character.
If we (and by "we" I mean Scawen) can somehow define a simple range of 16 characters within a codepage which are not used, we can use those characters for colors.
Most likely because that's the way we enter it... If there were 16 'color characters' it would send those, instead of ^x. It's basically not that different from the current system... But instead of lazily parsing a color every time it's displayed, my system parses it twice: once when the name is read from file, it parses the ^colornumber^ into a single 4-bit value (and saves it in an 8-bit character), and every time it's displayed it parses the 8-bit value and changes the color.
That still leaves us with the ineffective method of a color change costing us 2 characters in the name, instead of just the one that's really necessary.
I understand what you're getting at...
You want to exchange the actual character (0-9) behind the ^ for an ordinal value ( char(colornumber); )...
So let's say that I want color number 65 displayed, I'd type ^A (A's ordinal value being 65 in an ASCII codepage).
Check all of those codepages, and tell me what they all have in common... The first 32 values are all exactly the same: control characters. We can use the values 16-31 without problems, because they aren't actual characters, and we're bound not to use them on a single line of text anyway.
A simple binary boolean comparison would test for a color character...
0xF0 AND name[x] == 0x10
If that statement is true, then the character is a color code.
EDIT:
And of course the color we would want to set would be color[0x0F AND name[x]] (or simply name[x]-16)
Example:
(in binary, hope you understand your 1's and 0's)
let's say that name[x] is a character with ordinal value 25 (a control character).
In binary 25 is 00011001.
11110000 AND
00011001 =
00010000 (0xF0 AND name[x] = 0x10 = true, so it's a colorcode because we defined 16 colors ranging from 00010000 to 00011111 (16 to 31))
to get the color we simply do name[x] - 16 and we have the index for the color we want to set.
Because we'll have our flag next to our nickname soon anyway (I'm sure), there's no need for an unreadable black color, nor for a colored national nick.
Two things. Firstly, use a different control-code character, please! There's nothing wrong with $ or %, but using ^ would mess things right up in terms of passing the data, both for LFS and for those mods which strip colour codes out.
Anyway, the driver name is limited, so if this were to be implemented, an abbreviated system might be good, something like this:
$F00 = Red
$0F0 = Green
$F0F = Magenta
etc.
Many of them actually have the black in a separate set of symbols aside from their name, and trust me, it's much more readable. I look down the position order of drivers, and if I see a black name it's totally invisible, so since I don't know their name, I substitute a name I made up for that driver: "Mr. Unsociable". Mixing black in with other colours as you have done is better, though, provided I get at least one syllable to work with.