|
I just encountered these terms: Skinning, GDI/GDI+, OpenGL, DirectX, Bitmaps and Palettes. Which of these topics do I have to study if I'm going to change the overall look or design of my project (SDI/MDI or dialog-based)? It sounds like a theme, because even the buttons, edit boxes, other controls, and even the background, toolbars, menus, etc., will be changed.
|
|
|
|
|
Hello,
I think that skinning your application will suit you best. This enables your users to apply certain themes. It even enables them to make their own themes, should they want to do that...
GDI/GDI+, OpenGL and DirectX are more for heavy graphics. GDI is more for drawing graphs and simple stuff, while OpenGL and DirectX are more for the type of graphics you see in games.
Bitmaps and Palettes may come in handy if you want to change colors and the like at runtime without changing the actual files. You might want to look more into these subjects.
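To give a feel for the kind of "simple stuff" GDI handles well, here is a minimal sketch of a WM_PAINT handler that draws a small line graph (the window procedure and the sample points are made up for illustration only):

// Inside a window procedure: draw a polyline "graph" on WM_PAINT.
case WM_PAINT:
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hwnd, &ps);

    POINT pts[] = { {10, 80}, {60, 30}, {110, 60}, {160, 20} };
    HPEN hPen = CreatePen(PS_SOLID, 2, RGB(0, 0, 255));
    HGDIOBJ hOld = SelectObject(hdc, hPen);

    Polyline(hdc, pts, sizeof(pts) / sizeof(pts[0]));

    SelectObject(hdc, hOld);  // restore the previous pen
    DeleteObject(hPen);       // GDI objects must be released
    EndPaint(hwnd, &ps);
    return 0;
}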
Behind every great black man...
... is the police. - Conspiracy brother
Blog[^]
|
|
|
|
|
Ok, now I know what this stuff is all about. Thanx
|
|
|
|
|
You're welcome
Behind every great black man...
... is the police. - Conspiracy brother
Blog[^]
|
|
|
|
|
What function is running while a certain dialog or application is in standby mode?
I ask because I want to create a program that will scan a certain folder to see if a certain filename exists in that directory (the folder and filename will be entered by the user). While it is scanning the directory, I will display another dialog stating that it is scanning, just like a message box except that it has an OK and a Cancel button. The OK button will be disabled until the scanning is done. The Cancel button can be clicked prematurely, but will prompt the user if the scanning is not yet done. For the Cancel procedure, OnCancel will do, but what about enabling the OK button once the scanning is complete? The scanning dialog will be on standby and will just wait until the scanning is complete. I disabled the OK button in the OnInitDialog function of the scanning dialog class. Where should I put the code that enables the OK button? Thanx
|
|
|
|
|
Typically, you are either halted at something like GetMessage() or MsgWaitForMultipleObjects(), or else in a weird PeekMessage -> DoIdle() type of loop. It depends upon the underlying class library.
If you are using straight Win32, I would start a secondary thread to do the scanning and have it post registered messages back to your dialog to advise the dialog of the scanner's progress. If the user hits Cancel, just stop the thread and exit the dialog. The same goes for an MFC app with a dialog; a sketch follows below.
Be sure to catch the other ways the dialog can close by handling the WM_CLOSE messages generated by the Escape key, Alt+F4, etc.
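A minimal MFC-flavoured sketch of that pattern, assuming a hypothetical CScanDlg class with a boolean m_bScanDone member (all names are illustrative, not from the original post; WM_APP is used here for brevity instead of a registered message):

#define WM_SCAN_DONE (WM_APP + 1)   // user-defined message, assumed name

// Worker thread: does the scan, then notifies the dialog's thread.
UINT ScanThread(LPVOID pParam)
{
    HWND hDlg = (HWND)pParam;                 // dialog window handle
    // ... scan the directory for the file here ...
    ::PostMessage(hDlg, WM_SCAN_DONE, 0, 0);  // safe across threads
    return 0;
}

// In CScanDlg's message map: ON_MESSAGE(WM_SCAN_DONE, OnScanDone)
LRESULT CScanDlg::OnScanDone(WPARAM, LPARAM)
{
    GetDlgItem(IDOK)->EnableWindow(TRUE);  // scanning finished: enable OK
    m_bScanDone = TRUE;                    // checked by OnCancel's prompt
    return 0;
}

// Start the worker from OnInitDialog, after disabling the OK button:
//     AfxBeginThread(ScanThread, (LPVOID)GetSafeHwnd());

The key point is that only the dialog's own thread ever touches the controls; the worker just posts a message back.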
|
|
|
|
|
|
Hi all,
I'm facing an odd behaviour in a client application that I'm currently working on. Let me put you in the picture: I create a socket and then call ioctlsocket to set it to non-blocking mode. After this I try to establish a connection to a server through this socket. So far, so good. When I call 'connect', it is supposed to return control immediately with an error such as WSAEWOULDBLOCK, but this does not happen at all and it hangs for a while.
sockaddr_in server = {0};
SOCKET sClient = 0;

server.sin_family = AF_INET;
server.sin_port = htons(ESEP_SERVER_PORT_1);
server.sin_addr.S_un.S_addr = htonl(ulIP);

if ((sClient = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP)) == INVALID_SOCKET)
{
    FormatError();
    return ESEP_ERR_CREATE_SOCKET;
}

fd_set fdWrite = {0};
TIMEVAL time = {0};
int iRC = 0;
unsigned long ulMode = 0;

ulMode = 1;
if (ioctlsocket(sClient, FIONBIO, &ulMode) == SOCKET_ERROR)
{
    FormatError();
    return ESEP_ERR_CONNECT;
}

FD_ZERO(&fdWrite);
FD_SET(sClient, &fdWrite);
time.tv_usec = ESEP_CONNECT_TIMEOUT;

iRC = connect(sClient, (SOCKADDR *)&server, sizeof(server));
if ((iRC == SOCKET_ERROR) && (WSAGetLastError() == WSAEWOULDBLOCK))
{
    if (select(0, NULL, &fdWrite, NULL, &time) <= 0)
    {
        FormatError();
        return ESEP_ERR_CONEX_REJECTED;
    }
}
The most curious thing is that this behaviour only occurs on some particular computers, because if I run the application on my development machine or on some other machines, everything works absolutely fine.
I have taken two different machines with the same configuration (everything is the same, they're almost clones) and it works on one system but not on the other. Has anybody ever had a problem like this? Any reasonable explanation? I'm a bit puzzled because I can't work out why this happens. I've been searching in different forums and discussion boards but found nothing. Could you help me please?
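For what it's worth, a common variant of this pattern also sets tv_sec and passes an exception set to select(), since Winsock reports a failed non-blocking connect on the except set rather than the write set. A sketch under those assumptions (not a diagnosis of the machine-specific behaviour above):

// Wait for the non-blocking connect with both write and except sets.
fd_set fdWrite, fdExcept;
FD_ZERO(&fdWrite);  FD_SET(sClient, &fdWrite);
FD_ZERO(&fdExcept); FD_SET(sClient, &fdExcept);

TIMEVAL tv = {0};
tv.tv_sec  = 5;   // whole seconds; tv_usec alone can be a very short wait
tv.tv_usec = 0;

int iSel = select(0, NULL, &fdWrite, &fdExcept, &tv);
if (iSel <= 0 || FD_ISSET(sClient, &fdExcept))
{
    // timed out, select() failed, or the connect attempt failed
    return ESEP_ERR_CONEX_REJECTED;
}
// FD_ISSET(sClient, &fdWrite) here means the connection succeeded.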
|
|
|
|
|
Hi,
I am dealing with OLE automation. Now: I have to convert a string to V_BSTR to use it with OLE. This string is really long - about 500 chars and more! If I convert it to V_BSTR, it gets cut off after a few hundred bytes (about 230).
How can I avoid this? I use SysAllocStringLen to convert it; with this, enough space should normally be allocated for it. But it seems it doesn't care!
VARIANT v1;
V_VT(&v1) = VT_BSTR;
V_BSTR(&v1) = SysAllocStringLen(strToWc(selStr), 1500);
strToWc() is a method I wrote, which converts a string to wide chars. That part works; I checked, and the string is complete!
DKT
|
|
|
|
|
If my memory serves me right, then the "B" in BSTR stands for Byte - the byte that is used to hold the length of the string. This is the Pascal-type string representation. As a consequence, a BSTR cannot hold more than 255 chars. No way around it.
Cheers
Steen.
"To claim that computer games influence children is ridiculous. If Pacman had influenced children born in the 80'ies we would see a lot of youngsters running around in dark rooms eating pills while listening to monotonous music"
|
|
|
|
|
Hmmm, that's strange...
If I use a direct string:
V_BSTR(&v) = SysAllocString(OLESTR("blabla"));
then, it works. Even if the "blabla" string is more than 500 chars long!
If I use it the way I mentioned before, then it behaves strangely:
The first time it cuts some bytes off; the second time and onwards, it works!
If I put the same command twice, then it never works...
It seems as if the memory management is totally sh*t!
DKT
|
|
|
|
|
I'm sorry, my memory certainly didn't serve me well. I think I confused it with the ANSI version of BSTR (I think it's called BSTRT).
Just to be sure: you say your function converts it to widechar - you mean Unicode, right?
A BSTR is a pointer to a location where the first four bytes are the length part and the rest is the Unicode string, terminated by a double zero. Could this in any way be the cause of your problem (e.g. a premature terminating double zero)?
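To illustrate the length-prefix point, a minimal sketch (standard OLE calls; the sample data is made up):

// The length prefix, not the terminator, defines a BSTR's length.
// An embedded L'\0' doesn't truncate the BSTR itself, but any code
// treating it as a plain C string (wcslen etc.) stops at the first zero.
OLECHAR data[] = L"abc\0def";             // contains an embedded null
BSTR bstr = SysAllocStringLen(data, 7);   // copies all 7 characters

UINT uLen   = SysStringLen(bstr);         // 7: read from the prefix
size_t cLen = wcslen(bstr);               // 3: stops at the embedded null

SysFreeString(bstr);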
Otherwise, I don't have any good ideas. Perhaps you could post more code?
Cheers
Steen.
"To claim that computer games influence children is ridiculous. If Pacman had influenced children born in the 80'ies we would see a lot of youngsters running around in dark rooms eating pills while listening to monotonous music"
|
|
|
|
|
I checked the double zero, but it didn't make it better...
Here some code:
Function to convert String to widechar:
OLECHAR* ViaExcelConnector::strToWc(const string &cnvrtData) const
{
    OLECHAR cnvrt[500];
    int i = 0;
    char cnvrtChr[500];
    for(i=0; i
|
|
|
|
|
Your code got f***ed up because you didn't check the "do not treat <'s as HTML tags" box.
Anyway, you only allocate 500 chars to do the conversion, so it's no wonder that it won't work for more than 500 chars. And you have a buffer overrun when cnvrtData is more than 500 chars, since you don't check for a maximum of 500 chars in your for loop. There's no saying what will happen, but you will definitely get your memory screwed up. Furthermore, you return a pointer to cnvrt, which is a stack variable that will go out of scope (and be overwritten) when the function returns.
Why do you move the content of cnvrtData into cnvrtChr? Can't you do the LPCSTR cast directly on cnvrtData? Besides, you should call MultiByteToWideChar with cchWideChar set to zero first to get the length of the wide string, then allocate it, and then convert it:
int iLength = MultiByteToWideChar(CP_ACP, 0, (LPCSTR)cnvrtData, -1, NULL, 0);
OLECHAR* cnvrt = new OLECHAR[iLength];
MultiByteToWideChar(CP_ACP, 0, (LPCSTR)cnvrtData, -1, cnvrt, iLength);
return cnvrt;
but then you will have to remember to delete cnvrt (the return value from strToWc) with delete[] or you'll leak memory.
Cheers
Steen.
"To claim that computer games influence children is ridiculous. If Pacman had influenced children born in the 80'ies we would see a lot of youngsters running around in dark rooms eating pills while listening to monotonous music"
|
|
|
|
|
Oh, I forgot: you should check that the return value from MultiByteToWideChar(..., NULL, 0) is non-zero. If it is zero, call GetLastError to get info on why it failed.
Cheers
Steen.
"To claim that computer games influence children is ridiculous. If Pacman had influenced children born in the 80'ies we would see a lot of youngsters running around in dark rooms eating pills while listening to monotonous music"
|
|
|
|
|
Oh perfect. It works now!
BUT:
While testing, I used a string that was 442 chars long, so not longer than those 500!
And no, I couldn't easily convert the string to LPCSTR, because the string is a custom-written class (not written by me), but I found out that there is a c_str method in this class. And now it works!
Amazing. Thanks so much - this was a very clear answer, and I can learn a lot from it about how to code better!
DKT
|
|
|
|
|
Glad to help.
Cheers
Steen.
"To claim that computer games influence children is ridiculous. If Pacman had influenced children born in the 80'ies we would see a lot of youngsters running around in dark rooms eating pills while listening to monotonous music"
|
|
|
|
|
Windows has always let you change the colour depth of the desktop, e.g. XP lets you choose between 32- and 16-bit colour quality.
But what's the point? Once upon a time, speed and memory might have been an issue, but that's hardly relevant now. I'm going nuts trying to make sure my app renders toolbars and other images correctly by supplying bitmaps and icons with different colour depths, but is it even necessary?
Do people actually run at a lower colour depth any more?
The two most common elements in the universe are Hydrogen and stupidity. - Harlan Ellison
Awasu 2.1.2 [^]: A free RSS reader with support for Code Project.
|
|
|
|
|
|
Chris Losinger wrote:
yes
Um, OK. Why? Other than corner cases such as running Windows in safe mode, the only reason I could think of was lack of memory or graphics processing power. But surely that's almost a non-issue these days.
The reason I ask is that I'm trying to decide if it's worth spending time supporting multiple colour depths in my app. Currently, I use 32-bit images (where my GUI toolkit supports them, 24-bit otherwise) and fall back to 8-bit when the colour depth is less than 32. Testing shows that my app looks more or less acceptable at lower colour depths, but there are one or two places where it's completely wrong (black blocks) and I can't seem to fix them. I want to just forget about them and say it's not worth spending the time trying to fix them, since nobody runs at a lower colour depth
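For reference, a minimal sketch of the kind of depth check that fallback implies (GetDeviceCaps is standard GDI; the resource IDs are hypothetical):

// Query the display's colour depth and pick a bitmap resource.
HDC hdcScreen = GetDC(NULL);
int nBitsPixel = GetDeviceCaps(hdcScreen, BITSPIXEL)
               * GetDeviceCaps(hdcScreen, PLANES);
ReleaseDC(NULL, hdcScreen);

// IDB_TOOLBAR_32 / IDB_TOOLBAR_8 are made-up resource IDs.
UINT nBitmapID = (nBitsPixel >= 32) ? IDB_TOOLBAR_32 : IDB_TOOLBAR_8;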
The two most common elements in the universe are Hydrogen and stupidity. - Harlan Ellison
Awasu 2.1.2 [^]: A free RSS reader with support for Code Project.
|
|
|
|
|
Taka Muraoka wrote:
Why?
low-powered graphics cards on cheap/older laptops.
Taka Muraoka wrote:
The reason I ask is that I'm trying to decide if it's worth spending time supporting multiple colour depths in my app
personally, i wouldn't bother.
Taka Muraoka wrote:
there are one or two places where it's completely wrong (black blocks)
how are you doing your RGB 16<->RGB 32 conversion?
Cleek | Image Toolkits | Thumbnail maker
|
|
|
|
|
Chris Losinger wrote:
low-powered graphics cards on cheap/older laptops.
That's what I figured. People running that kind of hardware are only going to be using the free version of my software anyway
Chris Losinger wrote:
how are you doing your RGB 16<->RGB 32 conversion ?
The problem only occurs when I'm trying to draw 8-bit images on a 16-bit display under W2K, for example (XP is OK).
I don't draw the image myself, I just pass it to my GUI toolkit. BCGSoft's toolkit is generally pretty good, but I've already hit one or two rendering bugs (on non-XP systems), so I'm not sure if this problem is being caused by BCG's toolkit, something I'm doing wrong, or a problem in the image file.
I've noticed odd things when working with the image file. I fill the background with RGB(255,0,255), but when I reopen the file, many (but not all) of the pixels are RGB(254,0,254). IIRC, 8-bit images have a colour palette, but I haven't got a tool that lets me examine the palette. And when I switch in another image file that works, I still get the same problem, so I'm somewhat confused
Given that things work OK under XP at all colour depths, I suspect it's a problem with BCG.
The two most common elements in the universe are Hydrogen and stupidity. - Harlan Ellison
Awasu 2.1.2 [^]: A free RSS reader with support for Code Project.
|
|
|
|
|
There are some gotchas when converting to a paletted format. First off (in Windows, anyway), you don't get a whole 256-color palette - Windows reserves a number of colors for drawing UI elements when the video card is in 8-bit mode. Second, you don't necessarily get 8 bits of precision for each color channel, even in the palette - when the hardware is in 16-bit mode you'll get a 5-6-5 or similar division among RGB, while in 8-bit mode the palette (on older hardware, at least) will only support 18 bits for each entry.
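To make the 5-6-5 precision loss concrete, here's a small sketch of the round trip (plain bit arithmetic, not any particular toolkit's code):

// Pack 8-8-8 RGB into 5-6-5 and expand it back; the discarded low
// bits mean exact colour keys such as RGB(255,0,255) may not survive.
unsigned short Pack565(unsigned char r, unsigned char g, unsigned char b)
{
    return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

void Unpack565(unsigned short c,
               unsigned char &r, unsigned char &g, unsigned char &b)
{
    r = (unsigned char)(((c >> 11) & 0x1F) << 3);  // 5 bits back to 0..248
    g = (unsigned char)(((c >> 5)  & 0x3F) << 2);  // 6 bits back to 0..252
    b = (unsigned char)(( c        & 0x1F) << 3);  // 5 bits back to 0..248
}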
It is possible to write a conversion routine that takes this into account and still produces reasonably good results (tell me if you want code). But you're generally better off using a program where you can hand-tune the results. The free GIMP lets you manually edit the palette of 8-bit images, while the excellent (but not free) Paint Shop Pro supports the Windows palette directly. For icons, Microangelo does it all.
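As a starting point, a minimal sketch of the nearest-color matching such a conversion routine is built on (the palette array is assumed to come from elsewhere, e.g. GetSystemPaletteEntries):

// Find the palette entry closest to an RGB color by squared
// Euclidean distance in RGB space.
int NearestPaletteIndex(const PALETTEENTRY *pal, int nEntries,
                        unsigned char r, unsigned char g, unsigned char b)
{
    int  best = 0;
    long bestDist = 0x7FFFFFFF;
    for (int i = 0; i < nEntries; ++i)
    {
        long dr = (long)pal[i].peRed   - r;
        long dg = (long)pal[i].peGreen - g;
        long db = (long)pal[i].peBlue  - b;
        long dist = dr * dr + dg * dg + db * db;
        if (dist < bestDist) { bestDist = dist; best = i; }
    }
    return best;
}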
|
|
|
|
|
Taka Muraoka wrote:
Um, OK. Why? Other than corner cases such as running Windows in safe mode, the only reason I could think of was lack of memory or graphics processing power. But surely that's almost a non-issue these days.
I wish it were a non-issue... However, those of us who use massive amounts of memory are accustomed to applications failing on us.
I have a 256MB state-of-the-art graphics card and 2 gig of memory, and I'm still paging off disk because of Earth datasets drawn from about 50 gig of total data. I normally run at 32-bit color, except when we have a particularly large dataset where I don't mind sacrificing color depth to display more of it; then I cut to 16-bit color and double my available graphics memory.
My boss promised a new card, but he is holding out for a 512MB current-series one. I'll have it abused beyond its limits before year end.
Taka Muraoka wrote:
I'm want to just forget about them say it's not worth spending the time trying to fix them since nobody runs at a lower colour depth
I would say check with your user base. Like I said, those of us who have strange needs are used to strange failures because of them.
_________________________
Asu no koto o ieba, tenjo de nezumi ga warau.
Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
|
|
|
|
|
Jeffry J. Brickley wrote:
However, those of us who are using massive amounts of memory are accustomed to applications failing on us.
He he. I think we can safely categorize this as a corner case. If someone comes to me after having pushed their PC to within an inch of its limits and then some, and then complains that the colours in my program go a bit funny, well, that bug report goes straight to the bottom of the pile
The two most common elements in the universe are Hydrogen and stupidity. - Harlan Ellison
Awasu 2.1.2 [^]: A free RSS reader with support for Code Project.
|
|
|
|
|