|
Did you get the code I pasted?
I'm not clear on what you have understood and what you haven't.
void CCalendarControlView::OnFilePrintPreview()
{
    AFXPrintPreview(this);
}

void CCalendarControlView::DrawOnDC(CDC* pDC, int isPrinting)
{
    // memDC, bmp and pMemDC are members; rcClient is the view's client rectangle.
    CRect rc = rcClient;
    memDC.CreateCompatibleDC(pDC);
    bmp.CreateCompatibleBitmap(pDC, rc.Width(), rc.Height());
    memDC.SelectObject(&bmp);
    pMemDC = &memDC;
}
// Now drawing something on pMemDC,
// then BitBlt to the target DC:
pDC->BitBlt(0, 0, rc.Width(), rc.Height(), pMemDC, 0, 0, SRCCOPY);
This works fine when displaying in the window, but nothing shows up in OnFilePrintPreview().
If there is any problem understanding the code, please let me know.
Thanks for your effort.
|
|
|
|
|
Google doesn't bring up much for AFXPrintPreview, and I don't know it. Are you sure your drawing code gets called when you call that AFXPrintPreview method?
> The problem with computers is that they do what you tell them to do and not what you want them to do. <
> Life: great graphics, but the gameplay sux. <
|
|
|
|
|
Yes, I'm sure, since it works perfectly when I don't use a memory DC for print preview and draw directly on the display DC.
Actually, this is all provided by MFC itself: the DoPrintPreview and CPrintDialog code is already written, and those functions are being called.
If it works with the display DC, I don't know why it fails when I draw to a memory DC and copy to the display DC.
Thanks for your effort.
|
|
|
|
|
Your problem might be the rectangle you use for the drawing. Could it be that your buffer bitmap does not get the right size? Also, as far as I know, a print preview isn't exactly the same as drawing on the screen, since printers have different page sizes, DPIs, and so on. If I were you I would experiment with SetAttribDC[^]: try calling memDC.SetAttribDC(pDC->GetSafeHdc()) before you start drawing and memDC.ReleaseAttribDC()[^] afterwards, and see if anything changes.
|
|
|
|
|
Well, I tried that too, but it didn't help.
What you're suggesting could well be the case, but how do I get the correct size? All I know is that for print preview, the pDC I get in OnDraw() has m_bPrinting = 1, meaning it's a printer DC. It might be due to AFXPrintPreview() running code written in MFC itself, in viewcore.cpp and viewprev.cpp. You may have these files too if you have the latest Visual Studio, at
C:\Program Files\Microsoft Visual Studio 9.0\VC\atlmfc\src\mfc
Any idea how I can check whether the size I'm passing to the buffer DC is correct?
Thanks
|
|
|
|
|
Try using CDC::GetDeviceCaps[^], passing it HORZRES and VERTRES (or perhaps HORZSIZE and VERTSIZE) to get the size of the DC's surface. Just note that with printer DCs, the surface can be HUGE. Actually, do you really need to draw your print preview into a buffer?
|
|
|
|
|
I did this, but the problem is the same: nothing appears.
I have to use a buffer because there is so much flickering in the print preview window during resizing and so on, due to drawing directly on the display DC.

if (pDC->IsKindOf(RUNTIME_CLASS(CPreviewDC)))
{
    // Using a buffer and copying from the memory DC to the preview DC
    // causes a problem, so we draw directly on the preview DC during preview.
    memDC.CreateCompatibleDC(pDC);
    bmp.CreateCompatibleBitmap(pDC, pDC->GetDeviceCaps(HORZRES), pDC->GetDeviceCaps(VERTRES));
    memDC.SelectObject(&bmp);
    memDC.SetAttribDC(pDC->GetSafeHdc());
    //pMemDC = pDC;
    pMemDC = &memDC;
}
|
|
|
|
|
Does BitBlt succeed at all?
|
|
|
|
|
Yes, it does, in both cases (display and print preview).
The rc I'm passing to BitBlt and CreateCompatibleBitmap is 6400 x 4900; I think that's the size you were talking about, so it's correct, right?
For printing, I'm already setting m_rcPrintrect to the same size.
|
|
|
|
|
_T("No name") wrote: my drawing coming distorted
What did you mean by "distorted"?
You could try SetStretchBltMode with HALFTONE, then use StretchBlt.
(If my guess is correct...)
- ns -
|
|
|
|
|
I tried this too, but it didn't help. I have uploaded the code; I think that will help you.
Thanks
|
|
|
|
|
I have a class that implements two interfaces, IA and IB. Some portions of the code manipulate objects through IA pointers and other parts manipulate them through IB pointers. I would like those IA and IB clients to hold shared_ptr<IA> and shared_ptr<IB> instead of raw pointers. However, the ref count of one of the pointers could reach 0 before the other, delete the object, and leave the other pointer dangling.
How do I practice safe use of smart pointers to different interfaces of the same object?
Thanks
|
|
|
|
|
If you use Boost shared_ptrs[^], the reference count is held on the object rather than on the interface through which you access it - consider this code:
#include <boost/shared_ptr.hpp>
#include <iostream>

class A
{
public:
    ~A() { std::cout << "A::~A()\n"; }
};

class B
{
public:
    ~B() { std::cout << "B::~B()\n"; }
};

class C : public A, public B
{
public:
    ~C() { std::cout << "C::~C()\n"; }
};

int main(int argc, char** argv)
{
    {
        std::cout << "Enter pC\n";
        boost::shared_ptr<C> pC(new C);
        {
            std::cout << "Enter pB\n";
            boost::shared_ptr<B> pB(pC);
            std::cout << "Exit pB\n";
        }
        {
            std::cout << "Enter pA\n";
            boost::shared_ptr<A> pA(pC);
            std::cout << "Exit pA\n";
        }
        std::cout << "Exit pC\n";
    }
}
The output is
Enter pC
Enter pB
Exit pB
Enter pA
Exit pA
Exit pC
C::~C()
B::~B()
A::~A()
so the object is only destroyed once, when pC goes out of scope.
|
|
|
|
|
Thanks for the response. Unfortunately, I was not clear enough in my question. The original shared_ptr is a pointer to one of the base classes. Here is an example:

shared_ptr<Base1> pBase1(new Derived);
Derived* pDerived = dynamic_cast<Derived*>(pBase1.get());
shared_ptr<Base2> pBase2(pDerived); // second, independent ref count!

This code will crash (the object ends up deleted twice), but I want to accomplish the spirit of it:
1) Create an object.
2) Somewhere in client code, have safe access to the object in terms of a pointer to one base class.
3) Somewhere else in client code, have safe access to the object in terms of a pointer to another base class.
This appears to be more of a design question than a pointer/multiple inheritance use question.
Thanks
|
|
|
|
|
If you're using Boost shared pointers, that's easy enough to resolve: use boost::dynamic_pointer_cast, documented on this page[^], in place of dynamic_cast. Here's some code of similar shape to yours that uses it:
#include <iostream>
#include <boost/shared_ptr.hpp>

class A
{
public:
    virtual ~A() { std::cout << "~A\n"; }
};

class B
{
public:
    virtual ~B() { std::cout << "~B\n"; }
};

class C : public A, public B
{
public:
    virtual ~C() { std::cout << "~C\n"; }
};

int main(int argc, char** argv)
{
    boost::shared_ptr<A> pA(new C);
    boost::shared_ptr<C> pC = boost::dynamic_pointer_cast<C>(pA);
    boost::shared_ptr<B> pB(pC);
}
|
|
|
|
|
Thanks a bunch. That works great!
|
|
|
|
|
How can I convert a CString variable to a "binary" string, so that I can write it to a text file in that form and later read it back as a CString?
I am converting the CString like this:

CString strRet;
CString strname = "ABCDEF";
for (int i = 0; i < strname.GetLength(); ++i)
{
    CString str;
    str.Format("%2.2x", strname[i]);
    strRet += str;
}
AfxMessageBox(strRet);

But I don't know how to convert those values back to a CString.
Please help me.
|
|
|
|
|
Your function merely transforms the original string into another one, the latter containing the character codes of the former, represented as two-digit hexadecimal values. What do you want to do, really?
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler.
-- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong.
-- Iain Clarke
[My articles]
|
|
|
|
|
Do you want the user to be unable to read the strings when they open the file in Notepad or some other text editor? If so, I would suggest using some simple encryption rather than converting the string to hex/binary.
Regards,
Sandip.
|
|
|
|
|
In your code you can easily do this:

CString strname = "ABCDEF"; // if Unicode, use the macro => TEXT("ABCDEF")
// no transforming needed
AfxMessageBox(strname);

Writing and reading the data only have to be compatible. There is a tiny MFC class for that:
http://msdn.microsoft.com/en-us/library/aa314304(VS.60).aspx[^]
Try it out and step through the sources in the debugger.
Greetings from Germany
|
|
|
|
|
KarstenK wrote: //if Unicode use the macro => TEXT("ABCDEF")
No, use TEXT() (or _T()) any time you use CString with literals.
If you're using CStringW, prefix the literals with L; if you're using CStringA, just write the literal as you normally would.
_T() expands to the Unicode or ANSI version depending on the compilation mode, so you don't have to worry about it.
|
|
|
|
|
Hi,
I have created multiple dockable panes in the child frame, but I have one problem. There are four dockable panes in my child frame, and only the splitter of the last dockable pane is visible. Only when I double-click or adjust the last pane's splitter, click the "Auto Hide" button of any of the dockable panes, or close one of them do the other panes' splitters appear. Please let me know how to make the splitters of all the panes visible as soon as I run my application.
Thanks in advance.
Taruni
|
|
|
|
|
Hi all,
Can anyone help me with how to encrypt a string, write it to a text file, and decrypt the string when the file is read? Please help me.
|
|
|
|
|
Did you see Encrypt Sample: File Encryption[^]?
Of one Essence is the human race
thus has Creation put the base
One Limb impacted is sufficient
For all Others to feel the Mace
(Saadi )
|
|
|
|
|
If it is really "Urgent", then you may consider renting a coder.
|
|
|
|