|
3145990 == 0x00300106
That isn't an HRESULT failure code, since the high bit isn't set. Some of the Win32 error codes are defined relative to a base number -- e.g., the ERROR_INTERNET_* errors start at 12000 -- so I'd hazard a guess (based on how close the number is to a nice round hex number) that you're seeing a code from some subsystem. I haven't been able to find the #define for it either, though.
|
|
|
|
|
Hi All,
I have two applications which communicate via DDE. It works fine in Windows 2000, but when I run the applications in Win98, the string passed via DDE changes when it reaches the DDE server. Are there any specific changes I must make to get it to work with Win98?
Thanks...
---
Hakuna-Matada
It means no worries for the rest of your days...
It's our problem free, Philosophy
"I think my response was 'What idiot dreamed this up?'" -- Mary Ann Davidson, Oracle's chief security officer, in typical blunt manner, remembering her reaction to the company's scheme to brand its databases as "unbreakable."
|
|
|
|
|
Isn't it a Unicode versus ASCII issue?
|
|
|
|
|
|
HakunaMatada wrote: Windows 98 supports Unicode right?
I don't think it does, dude. Recompile the application using ASCII and I bet it will work fine.
|
|
|
|
|
So what you mean is change all TCHAR to CHAR, and make similar changes?
|
|
|
|
|
No, you don't need to do that; just define _MBCS instead of _UNICODE in your preprocessor definitions. See Chris Maunder's article on it.
|
|
|
|
|
Ray Kinsella wrote: _MBCS instead of _UNICODE
_MBCS is defined instead of _UNICODE! But it still doesn't work.
|
|
|
|
|
In what way does the string change, and what length is it?
|
|
|
|
|
The length of the string is somewhere around 268 characters, and when it reaches the DDE server it gains random characters at the end, increasing the length of the string to 284. Any idea why this happens?
|
|
|
|
|
Sounds like a string termination problem; Windows NT manages memory slightly better. It sounds like when you are copying the string into the DDE buffer you aren't terminating it in whatever way it wants to be terminated. Look up the documentation... but as a quick fix I would try using ZeroMemory on the buffer before copying anything into it.
|
|
|
|
|
Thanks a lot. Will look into it and get back to you tomorrow.
Again... Thanks for helping me out.
|
|
|
|
|
|
Can you help me with Rational Rose, please?
|
|
|
|
|
You could get help from this[^] forum.
|
|
|
|
|
Hey guys... I'm sure someone here will be able to help me with this one.
I'm looking at a log file that contains hex values.
I have a hex string that contains a date and time:
0xd4, 0x16, 0x12, 0xbe
The date is word 1 and the time is word 2:
The date is contained in 0xd4, 0x16
The time is contained in 0x12, 0xbe
I know that each one of these 0x hex values is a byte, and that a word is two bytes.
Now here is where I am stuck, because I don't know how to amalgamate the two bytes together.
Should I be adding them together, or should I be using the MAKELONG API, which puts one byte as the high word and the second as the low word?
|
|
|
|
|
MAKELONG won't do it; a LONG is too big (MAKEWORD is the two-byte version).
(0xd4 << 8) | 0x16 will do it, in the first instance. You need to shift one value left by 8 bits so it sits above the other one. Note the parentheses: << binds more loosely than +, so 0xd4 << 8 + 0x16 would not do what you expect.
Christian Graus - Microsoft MVP - C++
Metal Musings - Rex and my new metal blog
|
|
|
|
|
Ahhh... I see what you're saying.
So 0xd4 = 11010100 and 0x16 = 00010110.
By shifting 0xd4 left 8 places it becomes 1101010000000000, which leaves room to add the 0x16 to become 1101010000010110 (54294 decimal).
That's excellent. Thanks, mate.
Is this the normal way I should be amalgamating hex values, then?
|
|
|
|
|
Hey Christian, I seem to be tripping up over myself here.
So that works for the date part, and I did the same with the time part. So:
(0xd4 << 8) + 0x16 = 54294 (date)
(0x12 << 8) + 0xbe = 4798 (time)
Now do I add these two results to equal 59092, or do I need to do:
(54294 << 16) + 4798?
Doing this, though, gave me a massive value: 3558216382.
|
|
|
|
|
What date does 54294 represent, and what time does 4798 represent?
"Approved Workmen Are Not Ashamed" - 2 Timothy 2:15
"Judge not by the eye but by the heart." - Native American Proverb
|
|
|
|
|
Sorry, I was getting myself in a twist.
The full 4 bytes represent the date AND time, which means that I need to amalgamate all 4 bytes into a DWORD (which is 4 bytes, or 2 words).
(0xd4 << 8) + 0x16 = 54294 (date)
(0x12 << 8) + 0xbe = 4798 (time)
So the date/time value is actually:
(54294 << 16) + 4798 = 3558216382
which I believe to be seconds since the year 1996. However, I am having trouble converting this value to an actual time now! I am trying to get it into a SYSTEMTIME struct.
|
|
|
|
|
flippydeflippydebop wrote: which i believe to be seconds since the year 1996.
Since 1996 is not considered a special year by the computer, you'll need to add the appropriate number of seconds to that value so that it reflects date/time since, for example, 1-Jan-1970. That's roughly 820,454,400. Now you can use the date/time functions that handle values from 1-Jan-1970.
"Approved Workmen Are Not Ashamed" - 2 Timothy 2:15
"Judge not by the eye but by the heart." - Native American Proverb
|
|
|
|
|
Hi, I am trying to write a "synchronized" function call in my DLL so that programmers who use my API can expect to make only single-instance function calls. The basic motivation is that each function call spawns a thread, hence I would like to greatly restrict the number of threads spawned (basically one and only one thread spawned per function call to my API).
//In my program.h header file
int threadCode = 0;   // no problem if threadCode is initialized in the header file
HANDLE threadMutexEvent;
#define PROC_THREAD 0
#define ERR_THREAD_INUSE 1

//In my program.cpp program file
BOOL APIENTRY DllMain(HANDLE hModule, DWORD ul_reason_for_call, LPVOID lpReserved)
{
    // problem arises if I initialize threadCode here
    //threadCode = 0;
    threadMutexEvent = CreateEvent(NULL, false, true, "MutexEvent");
    return true;
}

int WINAPI processRequest()
{
    WaitForSingleObject(threadMutexEvent, INFINITE);
    if (threadCode == PROC_THREAD) {
        threadCode = ERR_THREAD_INUSE;
        SetEvent(threadMutexEvent);
    } else {
        SetEvent(threadMutexEvent);
        return ERR_THREAD_INUSE;
    }
    // spawn thread to do some work for me
    return 0;
}
As you can see, I am using Win32's synchronization mechanism to do some simple synchronization on the global variable threadCode. As noted in the code comments above, everything is fine if I initialize threadCode in the header file; however, once I shift the initialization of threadCode to DllMain, the synchronization gets messed up. Basically, separate calls to processRequest always "see" threadCode == PROC_THREAD and go ahead and spawn another thread (which is exactly what I am trying to prevent).
If all else fails I will fall back on leaving the initialization in the header file, but I am really curious as to why the two seemingly identical initializations produce two drastically different results.
Thanks
|
|
|
|
|
Hi
I think the reason is that in the first case (variable in the header) the initialization executes only once, but in the second case it depends on the DLL instance.
I think your variable has to be static; try it that way.
Regards
David
|
|
|
|
|
Hi David,
Thanks for the reply. I have tried using static and got the same result. I am quite sure only one instance of the DLL is being loaded each time my test application uses it, as I have other code in DllMain that clears the UI of all user input (which I don't see happening).
Thanks
|
|
|
|